4-Room Spatial Navigation Task Analyses

This notebook contains the analyses needed to generate the results for the 4-Room Spatial Navigation Task. The task outputs a plain-text log file, which first needs to be converted to the appropriate intermediate representations: the iPosition format for test results and an intermediate navigation format for study/test navigation results.

Note: Many of these processing steps can take quite a long time to run because they process several gigabytes of data; this is compounded when the data is stored on a remote server. If the intermediate data files have already been generated, it is advisable to skip the steps which regenerate them. These slow code blocks are marked with a note box.

Data Directories

To run most cells in this notebook, these data directories need to be defined (and populated with the appropriate data; see the Data Conversion section below to generate the data_output_directory contents).


In [1]:
import os

data_directory = r'S:\Work\Virtual Navigation Data'
data_output_directory = r'C:\Users\Kevin\Documents\GitHub\msl-iposition-pipeline\examples\saved_data\4-room-iposition'
intermediate_files_directory = r'C:\Users\Kevin\Documents\GitHub\msl-iposition-pipeline\examples\saved_data\4-room-intermediate-files'

generate_intermediate_files = False  # If True, the cells which generate intermediate files will run (they are very slow)
generate_embedded_animations = False  # If True, the embedded animations will be generated such that they are saved in the file

Intermediate Files


In [2]:
import os

def join_intermediate_path(filename):
    return os.path.join(intermediate_files_directory, filename)

if not os.path.exists(intermediate_files_directory):
    os.makedirs(intermediate_files_directory)

segmentation_filename = join_intermediate_path('4-room_segmentation_analysis.csv')
test_cogrecon_filename = join_intermediate_path('Holodeck 4-Room Spatial Navigation.csv')
misassignment_intermediate_filename = join_intermediate_path('4-room_misassignments_by_context.csv')
cbe_intermediate_filename = join_intermediate_path('4-room_cbe.csv')
nav_intermediate_filename = join_intermediate_path('4room_navigation_summary.csv')
nav_context_intermediate_filename = join_intermediate_path('4room_navigation_contexts_summary.csv')
full_dataset_filename = join_intermediate_path('4room_full_dataset.csv')
output_directory = '2018-04-25_16-55-14' # This is not needed if intermediate files are being regenerated
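
Since regenerating the intermediate files is slow, it can help to check which of them already exist before setting generate_intermediate_files to True. A minimal sketch (the helper name report_existing is hypothetical; in the notebook you would pass the *_filename variables defined above):

```python
import os
import tempfile

def report_existing(filenames):
    """Map each path to whether it already exists on disk."""
    return {name: os.path.exists(name) for name in filenames}

# Demo with a temporary file; in the notebook, pass the *_filename variables instead.
with tempfile.NamedTemporaryFile() as tmp:
    status = report_existing([tmp.name, 'missing_file.csv'])
    for name, exists in status.items():
        print('{0}: {1}'.format(os.path.basename(name), 'found' if exists else 'missing'))
```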

Data Conversion

First we'll process the data into the appropriate formats.

Note: The next cell can take a significant amount of time to run and should only be run on an **empty** directory (otherwise it will append to the files in the directory).

To run the cell, set generate_intermediate_files to True (in [Data Directories](#Data-Directories)).

In [3]:
import cogrecon.core.data_flexing.spatial_navigation.spatial_navigation_analytics as analytics
import os

if 'generate_intermediate_files' in vars() and generate_intermediate_files:
    # Snapshot the current subdirectories so the newly created output directory can be identified afterwards
    existing_dirs = [name for name in os.listdir('.') if os.path.isdir(name)]
    analytics.generate_intermediate_files(data_directory, full_study_path=True, full_study_look=False, full_test_path=False,
                                          full_test_look=False, full_practice_path=False, full_practice_look=False,
                                          full_test_2d=False, full_test_vr=True)
    output_directory = list(set([name for name in os.listdir('.') if os.path.isdir(name)]) - set(existing_dirs))[0]

Note: The next cell can take a significant amount of time to run. If the intermediate data files (defined in [Intermediate Files](#Intermediate-Files)) already exist, consider skipping it.

To run the cell, set generate_intermediate_files to True (in [Data Directories](#Data-Directories)).

In [4]:
from cogrecon.core.batch_pipeline import batch_pipeline
import cogrecon.core.data_flexing.spatial_navigation.spatial_navigation_analytics as analytics
import os

if 'generate_intermediate_files' in vars() and generate_intermediate_files:
    analytics.generate_segmentation_analysis(os.path.join(output_directory, 'vr_test.csv'), segmentation_filename)
    analytics.convert_to_iposition(os.path.join(output_directory, 'vr_test.csv'), data_output_directory)
    # Rename categories.txt to avoid confusion over which categories file to use downstream;
    # remove any stale backup first so the rename cannot fail on an existing target
    backup_path = os.path.join(data_output_directory, 'categories.txt.bak')
    if os.path.exists(backup_path):
        os.remove(backup_path)
    os.rename(os.path.join(data_output_directory, 'categories.txt'), backup_path)

Test Data Analyses

This subsection contains the analyses which generate basic statistics of interest for the following values:

  • Misplacement
  • Misassignments
  • Context-Boundary Effects (in Space)

Note that this section requires the intermediate files (generated above) in order to execute.
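
Of these measures, misplacement is the most direct: it is commonly summarized as the mean Euclidean distance between each item's studied location and the location the participant placed it at. A generic sketch of that metric (this is not the batch_pipeline implementation, which also applies transformation and de-swapping steps, as the log output below shows):

```python
import math

def mean_misplacement(actual, placed):
    """Mean Euclidean distance between paired (x, y) positions."""
    distances = [math.hypot(ax - px, ay - py)
                 for (ax, ay), (px, py) in zip(actual, placed)]
    return sum(distances) / len(distances)

# Two items, each placed a 3-4-5 triangle away from its true location
print(mean_misplacement([(0, 0), (10, 10)], [(3, 4), (13, 14)]))  # -> 5.0
```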

In [5]:
from cogrecon.core.batch_pipeline import batch_pipeline

batch_pipeline(data_output_directory, test_cogrecon_filename, collapse_trials=False,
               trial_by_trial_accuracy=False)


2018-04-26 15:03:59 DESKTOP-LKC15NF root[48320] INFO Finding files in folder C:\Users\Kevin\Documents\GitHub\msl-iposition-pipeline\examples\saved_data\4-room-iposition.
2018-04-26 15:04:00 DESKTOP-LKC15NF root[48320] INFO Found 44 data files in 0.0450000762939 seconds.
2018-04-26 15:04:00 DESKTOP-LKC15NF root[48320] INFO Parsing files with expected shape None.
2018-04-26 15:04:01 DESKTOP-LKC15NF root[48320] INFO The following ids were found and are being processed: ['001', '003', '004', '006', '009', '010', '011', '012', '013', '014', '016', '017', '018', '020', '021', '022', '023', '025', '101', '102', '103', '104', '105', '108', '110', '112', '113', '114', '115', '116', '123', '124', '125', '127', '128', '129', '131', '134', '135', '145', '146', '147', '148', '149']
2018-04-26 15:04:02 DESKTOP-LKC15NF root[48320] INFO ['001'] : The transformation function did not reduce the error, removing rotation and retying (old_error=116.577730726, new_error=128.399161709).
2018-04-26 15:04:02 DESKTOP-LKC15NF root[48320] WARNING ['001'] : The transformation function did not reduce the error, removing transform (old_error=116.577730726, new_error=126.785944188).
2018-04-26 15:04:02 DESKTOP-LKC15NF root[48320] INFO ['001'] : The transformation function did not reduce the error, removing rotation and retying (old_error=99.9597965218, new_error=105.1139933).
2018-04-26 15:04:02 DESKTOP-LKC15NF root[48320] INFO ['001'] : The transformation function did not reduce the error, removing rotation and retying (old_error=80.4215905037, new_error=86.9995647442).
2018-04-26 15:04:02 DESKTOP-LKC15NF root[48320] WARNING ['001'] : The transformation function did not reduce the error, removing transform (old_error=80.4215905037, new_error=81.6396265831).
2018-04-26 15:04:02 DESKTOP-LKC15NF root[48320] INFO ['003'] : The transformation function did not reduce the error, removing rotation and retying (old_error=198.894664647, new_error=261.160775112).
2018-04-26 15:04:02 DESKTOP-LKC15NF root[48320] WARNING ['003'] : The transformation function did not reduce the error, removing transform (old_error=198.894664647, new_error=208.584822141).
2018-04-26 15:04:02 DESKTOP-LKC15NF root[48320] INFO ['003'] : The transformation function did not reduce the error, removing rotation and retying (old_error=141.230927409, new_error=226.828671474).
2018-04-26 15:04:02 DESKTOP-LKC15NF root[48320] WARNING ['003'] : The transformation function did not reduce the error, removing transform (old_error=141.230927409, new_error=164.448206235).
2018-04-26 15:04:02 DESKTOP-LKC15NF root[48320] INFO ['003'] : The transformation function did not reduce the error, removing rotation and retying (old_error=144.667117486, new_error=175.837708416).
2018-04-26 15:04:02 DESKTOP-LKC15NF root[48320] INFO ['004'] : The transformation function did not reduce the error, removing rotation and retying (old_error=170.118856338, new_error=172.664607005).
2018-04-26 15:04:02 DESKTOP-LKC15NF root[48320] INFO ['004'] : The transformation function did not reduce the error, removing rotation and retying (old_error=191.200971295, new_error=201.763020743).
2018-04-26 15:04:02 DESKTOP-LKC15NF root[48320] INFO ['004'] : The transformation function did not reduce the error, removing rotation and retying (old_error=154.455812337, new_error=178.460085935).
2018-04-26 15:04:02 DESKTOP-LKC15NF root[48320] WARNING ['004'] : The transformation function did not reduce the error, removing transform (old_error=154.455812337, new_error=155.80298783).
2018-04-26 15:04:02 DESKTOP-LKC15NF root[48320] INFO ['004'] : The transformation function did not reduce the error, removing rotation and retying (old_error=186.628705088, new_error=207.358102853).
2018-04-26 15:04:03 DESKTOP-LKC15NF root[48320] INFO ['006'] : The transformation function did not reduce the error, removing rotation and retying (old_error=131.498696991, new_error=137.244322619).
2018-04-26 15:04:03 DESKTOP-LKC15NF root[48320] WARNING ['006'] : The transformation function did not reduce the error, removing transform (old_error=131.498696991, new_error=131.864780718).
2018-04-26 15:04:03 DESKTOP-LKC15NF root[48320] INFO ['006'] : The transformation function did not reduce the error, removing rotation and retying (old_error=126.521351058, new_error=156.03687753).
2018-04-26 15:04:03 DESKTOP-LKC15NF root[48320] WARNING ['006'] : The transformation function did not reduce the error, removing transform (old_error=126.521351058, new_error=130.809668318).
2018-04-26 15:04:03 DESKTOP-LKC15NF root[48320] INFO ['009'] : The transformation function did not reduce the error, removing rotation and retying (old_error=82.5326892745, new_error=83.7491027405).
2018-04-26 15:04:03 DESKTOP-LKC15NF root[48320] INFO ['009'] : The transformation function did not reduce the error, removing rotation and retying (old_error=125.20457533, new_error=192.898673116).
2018-04-26 15:04:03 DESKTOP-LKC15NF root[48320] WARNING ['009'] : The transformation function did not reduce the error, removing transform (old_error=125.20457533, new_error=138.806242523).
2018-04-26 15:04:03 DESKTOP-LKC15NF root[48320] INFO ['009'] : The transformation function did not reduce the error, removing rotation and retying (old_error=126.030997656, new_error=199.043563348).
2018-04-26 15:04:03 DESKTOP-LKC15NF root[48320] WARNING ['009'] : The transformation function did not reduce the error, removing transform (old_error=126.030997656, new_error=147.147704668).
2018-04-26 15:04:03 DESKTOP-LKC15NF root[48320] INFO ['009'] : The transformation function did not reduce the error, removing rotation and retying (old_error=57.3645076338, new_error=75.197233351).
2018-04-26 15:04:03 DESKTOP-LKC15NF root[48320] INFO ['010'] : The transformation function did not reduce the error, removing rotation and retying (old_error=140.738547401, new_error=162.216438172).
2018-04-26 15:04:04 DESKTOP-LKC15NF root[48320] WARNING ['010'] : The transformation function did not reduce the error, removing transform (old_error=140.738547401, new_error=141.435009974).
2018-04-26 15:04:04 DESKTOP-LKC15NF root[48320] INFO ['010'] : The transformation function did not reduce the error, removing rotation and retying (old_error=76.8092144309, new_error=105.230300454).
2018-04-26 15:04:04 DESKTOP-LKC15NF root[48320] WARNING ['010'] : The transformation function did not reduce the error, removing transform (old_error=76.8092144309, new_error=82.4754405786).
2018-04-26 15:04:04 DESKTOP-LKC15NF root[48320] INFO ['011'] : The transformation function did not reduce the error, removing rotation and retying (old_error=110.011268277, new_error=151.677376742).
2018-04-26 15:04:04 DESKTOP-LKC15NF root[48320] WARNING ['011'] : The transformation function did not reduce the error, removing transform (old_error=110.011268277, new_error=121.089681122).
2018-04-26 15:04:04 DESKTOP-LKC15NF root[48320] INFO ['011'] : The transformation function did not reduce the error, removing rotation and retying (old_error=97.1212065094, new_error=105.855002528).
2018-04-26 15:04:04 DESKTOP-LKC15NF root[48320] INFO ['011'] : The transformation function did not reduce the error, removing rotation and retying (old_error=98.410291124, new_error=125.02900472).
2018-04-26 15:04:04 DESKTOP-LKC15NF root[48320] WARNING ['011'] : The transformation function did not reduce the error, removing transform (old_error=98.410291124, new_error=108.561186698).
2018-04-26 15:04:04 DESKTOP-LKC15NF root[48320] INFO ['011'] : The transformation function did not reduce the error, removing rotation and retying (old_error=63.9660377205, new_error=92.121412117).
2018-04-26 15:04:04 DESKTOP-LKC15NF root[48320] WARNING ['011'] : The transformation function did not reduce the error, removing transform (old_error=63.9660377205, new_error=70.3346497695).
2018-04-26 15:04:04 DESKTOP-LKC15NF root[48320] INFO ['012'] : The transformation function did not reduce the error, removing rotation and retying (old_error=150.181860679, new_error=239.755475577).
2018-04-26 15:04:04 DESKTOP-LKC15NF root[48320] WARNING ['012'] : The transformation function did not reduce the error, removing transform (old_error=150.181860679, new_error=190.358768269).
2018-04-26 15:04:04 DESKTOP-LKC15NF root[48320] INFO ['012'] : The transformation function did not reduce the error, removing rotation and retying (old_error=116.00897283, new_error=140.235256812).
2018-04-26 15:04:04 DESKTOP-LKC15NF root[48320] INFO ['012'] : The transformation function did not reduce the error, removing rotation and retying (old_error=123.620787363, new_error=175.674803127).
2018-04-26 15:04:04 DESKTOP-LKC15NF root[48320] WARNING ['012'] : The transformation function did not reduce the error, removing transform (old_error=123.620787363, new_error=131.731108483).
2018-04-26 15:04:05 DESKTOP-LKC15NF root[48320] INFO ['013'] : The transformation function did not reduce the error, removing rotation and retying (old_error=68.7660394819, new_error=129.296050978).
2018-04-26 15:04:05 DESKTOP-LKC15NF root[48320] WARNING ['013'] : The transformation function did not reduce the error, removing transform (old_error=68.7660394819, new_error=76.5965497919).
2018-04-26 15:04:05 DESKTOP-LKC15NF root[48320] INFO ['013'] : The transformation function did not reduce the error, removing rotation and retying (old_error=59.9014244381, new_error=104.359658864).
2018-04-26 15:04:05 DESKTOP-LKC15NF root[48320] WARNING ['013'] : The transformation function did not reduce the error, removing transform (old_error=59.9014244381, new_error=72.0281269551).
2018-04-26 15:04:05 DESKTOP-LKC15NF root[48320] INFO ['013'] : The transformation function did not reduce the error, removing rotation and retying (old_error=51.0105568874, new_error=78.1062307109).
2018-04-26 15:04:05 DESKTOP-LKC15NF root[48320] WARNING ['013'] : The transformation function did not reduce the error, removing transform (old_error=51.0105568874, new_error=55.9412660009).
2018-04-26 15:04:05 DESKTOP-LKC15NF root[48320] INFO ['013'] : The transformation function did not reduce the error, removing rotation and retying (old_error=50.188576608, new_error=85.0158853325).
2018-04-26 15:04:05 DESKTOP-LKC15NF root[48320] WARNING ['013'] : The transformation function did not reduce the error, removing transform (old_error=50.188576608, new_error=55.3301782433).
2018-04-26 15:04:05 DESKTOP-LKC15NF root[48320] INFO ['014'] : The transformation function did not reduce the error, removing rotation and retying (old_error=118.89650437, new_error=125.779339434).
2018-04-26 15:04:05 DESKTOP-LKC15NF root[48320] INFO ['014'] : The transformation function did not reduce the error, removing rotation and retying (old_error=118.690964896, new_error=239.668009645).
2018-04-26 15:04:05 DESKTOP-LKC15NF root[48320] WARNING ['014'] : The transformation function did not reduce the error, removing transform (old_error=118.690964896, new_error=145.787395104).
2018-04-26 15:04:05 DESKTOP-LKC15NF root[48320] INFO ['014'] : The transformation function did not reduce the error, removing rotation and retying (old_error=101.034075257, new_error=164.454579451).
2018-04-26 15:04:05 DESKTOP-LKC15NF root[48320] WARNING ['014'] : The transformation function did not reduce the error, removing transform (old_error=101.034075257, new_error=110.852117981).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] INFO ['016'] : The transformation function did not reduce the error, removing rotation and retying (old_error=160.67346296, new_error=247.310733092).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] WARNING ['016'] : The transformation function did not reduce the error, removing transform (old_error=160.67346296, new_error=174.825332032).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] INFO ['016'] : The transformation function did not reduce the error, removing rotation and retying (old_error=143.369282032, new_error=174.095713141).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] INFO ['016'] : The transformation function did not reduce the error, removing rotation and retying (old_error=178.734611998, new_error=242.005027028).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] WARNING ['016'] : The transformation function did not reduce the error, removing transform (old_error=178.734611998, new_error=191.07506316).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] INFO ['017'] : The transformation function did not reduce the error, removing rotation and retying (old_error=134.936289196, new_error=152.870078212).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] WARNING ['017'] : The transformation function did not reduce the error, removing transform (old_error=134.936289196, new_error=135.248435611).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] INFO ['017'] : The transformation function did not reduce the error, removing rotation and retying (old_error=68.8413837505, new_error=72.7395208784).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] INFO ['017'] : The transformation function did not reduce the error, removing rotation and retying (old_error=60.3560041077, new_error=63.8526513629).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] INFO ['017'] : The transformation function did not reduce the error, removing rotation and retying (old_error=98.5263519279, new_error=147.908589379).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] WARNING ['017'] : The transformation function did not reduce the error, removing transform (old_error=98.5263519279, new_error=107.862870057).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] INFO ['018'] : The transformation function did not reduce the error, removing rotation and retying (old_error=159.076558831, new_error=172.016296226).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] WARNING ['018'] : The transformation function did not reduce the error, removing transform (old_error=159.076558831, new_error=162.218313534).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] INFO ['018'] : The transformation function did not reduce the error, removing rotation and retying (old_error=166.035587264, new_error=219.684894653).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] WARNING ['018'] : The transformation function did not reduce the error, removing transform (old_error=166.035587264, new_error=178.567944477).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] INFO ['018'] : The transformation function did not reduce the error, removing rotation and retying (old_error=139.246714618, new_error=154.474306218).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] WARNING ['018'] : The transformation function did not reduce the error, removing transform (old_error=139.246714618, new_error=139.591090974).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] INFO ['018'] : The transformation function did not reduce the error, removing rotation and retying (old_error=167.943345566, new_error=215.794734654).
2018-04-26 15:04:06 DESKTOP-LKC15NF root[48320] WARNING ['018'] : The transformation function did not reduce the error, removing transform (old_error=167.943345566, new_error=177.92610791).
2018-04-26 15:04:07 DESKTOP-LKC15NF root[48320] INFO ['020'] : The transformation function did not reduce the error, removing rotation and retying (old_error=141.76721208, new_error=162.366762032).
2018-04-26 15:04:07 DESKTOP-LKC15NF root[48320] WARNING ['020'] : The transformation function did not reduce the error, removing transform (old_error=141.76721208, new_error=142.566500939).
2018-04-26 15:04:07 DESKTOP-LKC15NF root[48320] INFO ['020'] : The transformation function did not reduce the error, removing rotation and retying (old_error=92.508179698, new_error=103.904403516).
2018-04-26 15:04:07 DESKTOP-LKC15NF root[48320] INFO ['020'] : The transformation function did not reduce the error, removing rotation and retying (old_error=122.080288794, new_error=122.192491471).
2018-04-26 15:04:07 DESKTOP-LKC15NF root[48320] WARNING ['020'] : The transformation function did not reduce the error, removing transform (old_error=122.080288794, new_error=122.560120703).
2018-04-26 15:04:07 DESKTOP-LKC15NF root[48320] INFO ['020'] : The transformation function did not reduce the error, removing rotation and retying (old_error=89.9426496067, new_error=119.772570909).
2018-04-26 15:04:07 DESKTOP-LKC15NF root[48320] WARNING ['020'] : The transformation function did not reduce the error, removing transform (old_error=89.9426496067, new_error=93.7455742709).
2018-04-26 15:04:07 DESKTOP-LKC15NF root[48320] INFO ['021'] : The transformation function did not reduce the error, removing rotation and retying (old_error=204.772207029, new_error=209.143433788).
2018-04-26 15:04:07 DESKTOP-LKC15NF root[48320] INFO ['021'] : The transformation function did not reduce the error, removing rotation and retying (old_error=127.275923959, new_error=130.386377444).
2018-04-26 15:04:08 DESKTOP-LKC15NF root[48320] INFO ['022'] : The transformation function did not reduce the error, removing rotation and retying (old_error=199.322209239, new_error=264.549021312).
2018-04-26 15:04:08 DESKTOP-LKC15NF root[48320] WARNING ['022'] : The transformation function did not reduce the error, removing transform (old_error=199.322209239, new_error=207.063503215).
2018-04-26 15:04:08 DESKTOP-LKC15NF root[48320] INFO ['022'] : The transformation function did not reduce the error, removing rotation and retying (old_error=133.756769127, new_error=189.099870333).
2018-04-26 15:04:08 DESKTOP-LKC15NF root[48320] WARNING ['022'] : The transformation function did not reduce the error, removing transform (old_error=133.756769127, new_error=141.940404624).
2018-04-26 15:04:08 DESKTOP-LKC15NF root[48320] INFO ['022'] : The transformation function did not reduce the error, removing rotation and retying (old_error=149.590800992, new_error=161.373363661).
2018-04-26 15:04:08 DESKTOP-LKC15NF root[48320] INFO ['023'] : The transformation function did not reduce the error, removing rotation and retying (old_error=168.162011761, new_error=239.904333691).
2018-04-26 15:04:08 DESKTOP-LKC15NF root[48320] WARNING ['023'] : The transformation function did not reduce the error, removing transform (old_error=168.162011761, new_error=205.022573818).
2018-04-26 15:04:08 DESKTOP-LKC15NF root[48320] INFO ['023'] : The transformation function did not reduce the error, removing rotation and retying (old_error=134.651198631, new_error=228.671429869).
2018-04-26 15:04:08 DESKTOP-LKC15NF root[48320] WARNING ['023'] : The transformation function did not reduce the error, removing transform (old_error=134.651198631, new_error=156.9397946).
2018-04-26 15:04:08 DESKTOP-LKC15NF root[48320] INFO ['023'] : The transformation function did not reduce the error, removing rotation and retying (old_error=116.966812554, new_error=160.018673758).
2018-04-26 15:04:08 DESKTOP-LKC15NF root[48320] WARNING ['023'] : The transformation function did not reduce the error, removing transform (old_error=116.966812554, new_error=128.612300339).
2018-04-26 15:04:08 DESKTOP-LKC15NF root[48320] INFO ['025'] : The transformation function did not reduce the error, removing rotation and retying (old_error=110.527272036, new_error=144.252304109).
2018-04-26 15:04:08 DESKTOP-LKC15NF root[48320] WARNING ['025'] : The transformation function did not reduce the error, removing transform (old_error=110.527272036, new_error=118.105321208).
2018-04-26 15:04:08 DESKTOP-LKC15NF root[48320] INFO ['025'] : The transformation function did not reduce the error, removing rotation and retying (old_error=75.7267080697, new_error=173.448968478).
2018-04-26 15:04:08 DESKTOP-LKC15NF root[48320] WARNING ['025'] : The transformation function did not reduce the error, removing transform (old_error=75.7267080697, new_error=103.28111511).
2018-04-26 15:04:08 DESKTOP-LKC15NF root[48320] INFO ['025'] : The transformation function did not reduce the error, removing rotation and retying (old_error=102.881752178, new_error=128.885411359).
2018-04-26 15:04:09 DESKTOP-LKC15NF root[48320] INFO ['025'] : The transformation function did not reduce the error, removing rotation and retying (old_error=105.021084333, new_error=147.300967862).
2018-04-26 15:04:09 DESKTOP-LKC15NF root[48320] WARNING ['025'] : The transformation function did not reduce the error, removing transform (old_error=105.021084333, new_error=113.459353026).
2018-04-26 15:04:09 DESKTOP-LKC15NF root[48320] INFO ['101'] : The transformation function did not reduce the error, removing rotation and retying (old_error=141.480417872, new_error=163.746748497).
2018-04-26 15:04:09 DESKTOP-LKC15NF root[48320] INFO ['101'] : The transformation function did not reduce the error, removing rotation and retying (old_error=96.4806077587, new_error=144.753196801).
2018-04-26 15:04:09 DESKTOP-LKC15NF root[48320] WARNING ['101'] : The transformation function did not reduce the error, removing transform (old_error=96.4806077587, new_error=109.989563432).
2018-04-26 15:04:09 DESKTOP-LKC15NF root[48320] INFO ['101'] : The transformation function did not reduce the error, removing rotation and retying (old_error=110.526397199, new_error=255.749189287).
2018-04-26 15:04:09 DESKTOP-LKC15NF root[48320] WARNING ['101'] : The transformation function did not reduce the error, removing transform (old_error=110.526397199, new_error=144.040928416).
2018-04-26 15:04:09 DESKTOP-LKC15NF root[48320] INFO ['101'] : The transformation function did not reduce the error, removing rotation and retying (old_error=94.2832501571, new_error=146.644080841).
2018-04-26 15:04:09 DESKTOP-LKC15NF root[48320] WARNING ['101'] : The transformation function did not reduce the error, removing transform (old_error=94.2832501571, new_error=104.137815828).
2018-04-26 15:04:09 DESKTOP-LKC15NF root[48320] INFO ['102'] : The transformation function did not reduce the error, removing rotation and retying (old_error=151.807700566, new_error=188.435734251).
2018-04-26 15:04:09 DESKTOP-LKC15NF root[48320] INFO ['102'] : The transformation function did not reduce the error, removing rotation and retying (old_error=146.145284355, new_error=192.514770636).
2018-04-26 15:04:09 DESKTOP-LKC15NF root[48320] WARNING ['102'] : The transformation function did not reduce the error, removing transform (old_error=146.145284355, new_error=153.436505107).
2018-04-26 15:04:10 DESKTOP-LKC15NF root[48320] INFO ['103'] : The transformation function did not reduce the error, removing rotation and retying (old_error=204.633288105, new_error=241.293156643).
2018-04-26 15:04:10 DESKTOP-LKC15NF root[48320] WARNING ['103'] : The transformation function did not reduce the error, removing transform (old_error=204.633288105, new_error=209.371971598).
2018-04-26 15:04:10 DESKTOP-LKC15NF root[48320] INFO ['103'] : The transformation function did not reduce the error, removing rotation and retying (old_error=118.646579165, new_error=177.682854231).
2018-04-26 15:04:10 DESKTOP-LKC15NF root[48320] WARNING ['103'] : The transformation function did not reduce the error, removing transform (old_error=118.646579165, new_error=128.01906343).
2018-04-26 15:04:10 DESKTOP-LKC15NF root[48320] INFO ['104'] : The transformation function did not reduce the error, removing rotation and retying (old_error=232.949446785, new_error=322.767539926).
2018-04-26 15:04:10 DESKTOP-LKC15NF root[48320] WARNING ['104'] : The transformation function did not reduce the error, removing transform (old_error=232.949446785, new_error=255.863696852).
2018-04-26 15:04:10 DESKTOP-LKC15NF root[48320] INFO ['105'] : The transformation function did not reduce the error, removing rotation and retying (old_error=140.412735452, new_error=209.198408651).
2018-04-26 15:04:10 DESKTOP-LKC15NF root[48320] WARNING ['105'] : The transformation function did not reduce the error, removing transform (old_error=140.412735452, new_error=157.668289903).
2018-04-26 15:04:10 DESKTOP-LKC15NF root[48320] INFO ['105'] : The transformation function did not reduce the error, removing rotation and retying (old_error=107.765432199, new_error=181.197438277).
2018-04-26 15:04:10 DESKTOP-LKC15NF root[48320] WARNING ['105'] : The transformation function did not reduce the error, removing transform (old_error=107.765432199, new_error=120.129527287).
2018-04-26 15:04:11 DESKTOP-LKC15NF root[48320] INFO ['108'] : The transformation function did not reduce the error, removing rotation and retying (old_error=310.445138753, new_error=667.260432596).
2018-04-26 15:04:11 DESKTOP-LKC15NF root[48320] WARNING ['108'] : The transformation function did not reduce the error, removing transform (old_error=310.445138753, new_error=461.542685291).
2018-04-26 15:04:11 DESKTOP-LKC15NF root[48320] INFO ['108'] : The transformation function did not reduce the error, removing rotation and retying (old_error=128.36586763, new_error=152.415908545).
2018-04-26 15:04:11 DESKTOP-LKC15NF root[48320] INFO ['108'] : The transformation function did not reduce the error, removing rotation and retying (old_error=63.6499062195, new_error=91.3072492604).
2018-04-26 15:04:11 DESKTOP-LKC15NF root[48320] INFO ['110'] : The transformation function did not reduce the error, removing rotation and retying (old_error=321.305657174, new_error=366.961647973).
2018-04-26 15:04:11 DESKTOP-LKC15NF root[48320] WARNING ['110'] : The transformation function did not reduce the error, removing transform (old_error=321.305657174, new_error=341.860570416).
2018-04-26 15:04:11 DESKTOP-LKC15NF root[48320] INFO ['110'] : The transformation function did not reduce the error, removing rotation and retying (old_error=243.242658432, new_error=438.376749491).
2018-04-26 15:04:11 DESKTOP-LKC15NF root[48320] WARNING ['110'] : The transformation function did not reduce the error, removing transform (old_error=243.242658432, new_error=315.789010077).
2018-04-26 15:04:11 DESKTOP-LKC15NF root[48320] INFO ['110'] : The transformation function did not reduce the error, removing rotation and retying (old_error=282.06759476, new_error=290.291758797).
2018-04-26 15:04:11 DESKTOP-LKC15NF root[48320] WARNING ['110'] : The transformation function did not reduce the error, removing transform (old_error=282.06759476, new_error=283.847706138).
2018-04-26 15:04:12 DESKTOP-LKC15NF root[48320] INFO ['112'] : The transformation function did not reduce the error, removing rotation and retying (old_error=122.99870701, new_error=124.654102917).
2018-04-26 15:04:12 DESKTOP-LKC15NF root[48320] INFO ['112'] : The transformation function did not reduce the error, removing rotation and retying (old_error=246.358790529, new_error=281.988374053).
2018-04-26 15:04:12 DESKTOP-LKC15NF root[48320] WARNING ['112'] : The transformation function did not reduce the error, removing transform (old_error=246.358790529, new_error=249.084718777).
2018-04-26 15:04:12 DESKTOP-LKC15NF root[48320] INFO ['112'] : The transformation function did not reduce the error, removing rotation and retying (old_error=105.034425815, new_error=114.798438931).
2018-04-26 15:04:12 DESKTOP-LKC15NF root[48320] INFO ['113'] : The transformation function did not reduce the error, removing rotation and retying (old_error=164.509784024, new_error=177.81117708).
2018-04-26 15:04:12 DESKTOP-LKC15NF root[48320] WARNING ['113'] : The transformation function did not reduce the error, removing transform (old_error=164.509784024, new_error=168.549784387).
2018-04-26 15:04:12 DESKTOP-LKC15NF root[48320] INFO ['113'] : The transformation function did not reduce the error, removing rotation and retying (old_error=228.080173129, new_error=233.172749164).
2018-04-26 15:04:12 DESKTOP-LKC15NF root[48320] INFO ['113'] : The transformation function did not reduce the error, removing rotation and retying (old_error=147.496475163, new_error=164.128287492).
2018-04-26 15:04:12 DESKTOP-LKC15NF root[48320] WARNING ['113'] : The transformation function did not reduce the error, removing transform (old_error=147.496475163, new_error=148.343285181).
2018-04-26 15:04:12 DESKTOP-LKC15NF root[48320] INFO ['114'] : The transformation function did not reduce the error, removing rotation and retying (old_error=162.79017008, new_error=173.579246648).
2018-04-26 15:04:12 DESKTOP-LKC15NF root[48320] INFO ['114'] : The transformation function did not reduce the error, removing rotation and retying (old_error=137.078744861, new_error=139.120774558).
2018-04-26 15:04:12 DESKTOP-LKC15NF root[48320] INFO ['114'] : The transformation function did not reduce the error, removing rotation and retying (old_error=154.890437266, new_error=158.108218368).
2018-04-26 15:04:13 DESKTOP-LKC15NF root[48320] INFO ['115'] : The transformation function did not reduce the error, removing rotation and retying (old_error=68.9935361488, new_error=77.1944938092).
2018-04-26 15:04:13 DESKTOP-LKC15NF root[48320] INFO ['115'] : The transformation function did not reduce the error, removing rotation and retying (old_error=128.946527598, new_error=201.141031084).
2018-04-26 15:04:13 DESKTOP-LKC15NF root[48320] WARNING ['115'] : The transformation function did not reduce the error, removing transform (old_error=128.946527598, new_error=143.731899042).
2018-04-26 15:04:13 DESKTOP-LKC15NF root[48320] INFO ['115'] : The transformation function did not reduce the error, removing rotation and retying (old_error=73.4040317974, new_error=151.511808323).
2018-04-26 15:04:13 DESKTOP-LKC15NF root[48320] WARNING ['115'] : The transformation function did not reduce the error, removing transform (old_error=73.4040317974, new_error=90.8534586635).
2018-04-26 15:04:13 DESKTOP-LKC15NF root[48320] INFO ['116'] : The transformation function did not reduce the error, removing rotation and retying (old_error=96.0347355361, new_error=133.815758309).
2018-04-26 15:04:13 DESKTOP-LKC15NF root[48320] WARNING ['116'] : The transformation function did not reduce the error, removing transform (old_error=96.0347355361, new_error=102.230080675).
2018-04-26 15:04:13 DESKTOP-LKC15NF root[48320] INFO ['116'] : The transformation function did not reduce the error, removing rotation and retying (old_error=105.028227623, new_error=115.096364231).
2018-04-26 15:04:13 DESKTOP-LKC15NF root[48320] WARNING ['116'] : The transformation function did not reduce the error, removing transform (old_error=105.028227623, new_error=106.86021059).
2018-04-26 15:04:13 DESKTOP-LKC15NF root[48320] INFO ['116'] : The transformation function did not reduce the error, removing rotation and retying (old_error=51.5859599803, new_error=86.0076217006).
2018-04-26 15:04:13 DESKTOP-LKC15NF root[48320] WARNING ['116'] : The transformation function did not reduce the error, removing transform (old_error=51.5859599803, new_error=58.4119628373).
2018-04-26 15:04:13 DESKTOP-LKC15NF root[48320] INFO ['116'] : The transformation function did not reduce the error, removing rotation and retying (old_error=59.8471804056, new_error=139.868812551).
2018-04-26 15:04:13 DESKTOP-LKC15NF root[48320] WARNING ['116'] : The transformation function did not reduce the error, removing transform (old_error=59.8471804056, new_error=78.8959577892).
2018-04-26 15:04:14 DESKTOP-LKC15NF root[48320] INFO ['123'] : The transformation function did not reduce the error, removing rotation and retying (old_error=190.800908933, new_error=195.738642363).
2018-04-26 15:04:14 DESKTOP-LKC15NF root[48320] INFO ['123'] : The transformation function did not reduce the error, removing rotation and retying (old_error=116.199653518, new_error=118.561530398).
2018-04-26 15:04:14 DESKTOP-LKC15NF root[48320] WARNING ['123'] : The transformation function did not reduce the error, removing transform (old_error=116.199653518, new_error=117.422389783).
2018-04-26 15:04:14 DESKTOP-LKC15NF root[48320] INFO ['124'] : The transformation function did not reduce the error, removing rotation and retying (old_error=183.650839739, new_error=286.358948133).
2018-04-26 15:04:14 DESKTOP-LKC15NF root[48320] WARNING ['124'] : The transformation function did not reduce the error, removing transform (old_error=183.650839739, new_error=205.362122996).
2018-04-26 15:04:14 DESKTOP-LKC15NF root[48320] INFO ['124'] : The transformation function did not reduce the error, removing rotation and retying (old_error=154.227900894, new_error=160.85707982).
2018-04-26 15:04:14 DESKTOP-LKC15NF root[48320] INFO ['124'] : The transformation function did not reduce the error, removing rotation and retying (old_error=136.523630545, new_error=142.636183752).
2018-04-26 15:04:14 DESKTOP-LKC15NF root[48320] INFO ['125'] : The transformation function did not reduce the error, removing rotation and retying (old_error=110.83095415, new_error=148.51347702).
2018-04-26 15:04:14 DESKTOP-LKC15NF root[48320] WARNING ['125'] : The transformation function did not reduce the error, removing transform (old_error=110.83095415, new_error=117.464426419).
2018-04-26 15:04:14 DESKTOP-LKC15NF root[48320] INFO ['125'] : The transformation function did not reduce the error, removing rotation and retying (old_error=144.57356278, new_error=182.568121842).
2018-04-26 15:04:14 DESKTOP-LKC15NF root[48320] WARNING ['125'] : The transformation function did not reduce the error, removing transform (old_error=144.57356278, new_error=150.550593632).
2018-04-26 15:04:14 DESKTOP-LKC15NF root[48320] INFO ['125'] : The transformation function did not reduce the error, removing rotation and retying (old_error=69.1791949631, new_error=99.4875203453).
2018-04-26 15:04:14 DESKTOP-LKC15NF root[48320] WARNING ['125'] : The transformation function did not reduce the error, removing transform (old_error=69.1791949631, new_error=76.3319273093).
2018-04-26 15:04:14 DESKTOP-LKC15NF root[48320] INFO ['125'] : The transformation function did not reduce the error, removing rotation and retying (old_error=45.2241468833, new_error=49.724044386).
2018-04-26 15:04:14 DESKTOP-LKC15NF root[48320] WARNING ['125'] : The transformation function did not reduce the error, removing transform (old_error=45.2241468833, new_error=45.7769766683).
2018-04-26 15:04:15 DESKTOP-LKC15NF root[48320] INFO ['127'] : The transformation function did not reduce the error, removing rotation and retying (old_error=331.107859606, new_error=402.967494495).
2018-04-26 15:04:15 DESKTOP-LKC15NF root[48320] WARNING ['127'] : The transformation function did not reduce the error, removing transform (old_error=331.107859606, new_error=393.716923322).
2018-04-26 15:04:15 DESKTOP-LKC15NF root[48320] INFO ['127'] : The transformation function did not reduce the error, removing rotation and retying (old_error=318.626172582, new_error=432.720038346).
2018-04-26 15:04:15 DESKTOP-LKC15NF root[48320] WARNING ['127'] : The transformation function did not reduce the error, removing transform (old_error=318.626172582, new_error=417.393097483).
2018-04-26 15:04:15 DESKTOP-LKC15NF root[48320] INFO ['127'] : The transformation function did not reduce the error, removing rotation and retying (old_error=383.038243959, new_error=1231.36926862).
2018-04-26 15:04:15 DESKTOP-LKC15NF root[48320] WARNING ['127'] : The transformation function did not reduce the error, removing transform (old_error=383.038243959, new_error=1226.48280878).
2018-04-26 15:04:15 DESKTOP-LKC15NF root[48320] INFO ['127'] : The transformation function did not reduce the error, removing rotation and retying (old_error=138.282028617, new_error=156.717562157).
2018-04-26 15:04:15 DESKTOP-LKC15NF root[48320] WARNING ['127'] : The transformation function did not reduce the error, removing transform (old_error=138.282028617, new_error=138.977080433).
2018-04-26 15:04:15 DESKTOP-LKC15NF root[48320] INFO ['128'] : The transformation function did not reduce the error, removing rotation and retying (old_error=106.177066557, new_error=117.147457358).
2018-04-26 15:04:15 DESKTOP-LKC15NF root[48320] WARNING ['128'] : The transformation function did not reduce the error, removing transform (old_error=106.177066557, new_error=106.387839861).
2018-04-26 15:04:15 DESKTOP-LKC15NF root[48320] INFO ['128'] : The transformation function did not reduce the error, removing rotation and retying (old_error=182.227823902, new_error=225.974334987).
2018-04-26 15:04:15 DESKTOP-LKC15NF root[48320] WARNING ['128'] : The transformation function did not reduce the error, removing transform (old_error=182.227823902, new_error=194.910102333).
2018-04-26 15:04:15 DESKTOP-LKC15NF root[48320] INFO ['128'] : The transformation function did not reduce the error, removing rotation and retying (old_error=54.5457923911, new_error=109.453520289).
2018-04-26 15:04:15 DESKTOP-LKC15NF root[48320] WARNING ['128'] : The transformation function did not reduce the error, removing transform (old_error=54.5457923911, new_error=68.0175864797).
2018-04-26 15:04:15 DESKTOP-LKC15NF root[48320] INFO ['128'] : The transformation function did not reduce the error, removing rotation and retying (old_error=49.2402444853, new_error=106.406929079).
2018-04-26 15:04:15 DESKTOP-LKC15NF root[48320] WARNING ['128'] : The transformation function did not reduce the error, removing transform (old_error=49.2402444853, new_error=63.1633007825).
2018-04-26 15:04:16 DESKTOP-LKC15NF root[48320] INFO ['129'] : The transformation function did not reduce the error, removing rotation and retying (old_error=142.176875827, new_error=221.588161056).
2018-04-26 15:04:16 DESKTOP-LKC15NF root[48320] WARNING ['129'] : The transformation function did not reduce the error, removing transform (old_error=142.176875827, new_error=163.520212471).
2018-04-26 15:04:16 DESKTOP-LKC15NF root[48320] INFO ['129'] : The transformation function did not reduce the error, removing rotation and retying (old_error=174.241620967, new_error=199.381145157).
2018-04-26 15:04:16 DESKTOP-LKC15NF root[48320] WARNING ['129'] : The transformation function did not reduce the error, removing transform (old_error=174.241620967, new_error=174.810332031).
2018-04-26 15:04:16 DESKTOP-LKC15NF root[48320] INFO ['129'] : The transformation function did not reduce the error, removing rotation and retying (old_error=128.1752076, new_error=226.646020377).
2018-04-26 15:04:16 DESKTOP-LKC15NF root[48320] WARNING ['129'] : The transformation function did not reduce the error, removing transform (old_error=128.1752076, new_error=149.233218137).
2018-04-26 15:04:16 DESKTOP-LKC15NF root[48320] INFO ['129'] : The transformation function did not reduce the error, removing rotation and retying (old_error=94.7558404798, new_error=113.461250709).
2018-04-26 15:04:16 DESKTOP-LKC15NF root[48320] WARNING ['129'] : The transformation function did not reduce the error, removing transform (old_error=94.7558404798, new_error=95.3639707034).
2018-04-26 15:04:16 DESKTOP-LKC15NF root[48320] INFO ['131'] : The transformation function did not reduce the error, removing rotation and retying (old_error=164.060225589, new_error=250.023228437).
2018-04-26 15:04:16 DESKTOP-LKC15NF root[48320] WARNING ['131'] : The transformation function did not reduce the error, removing transform (old_error=164.060225589, new_error=182.893208181).
2018-04-26 15:04:16 DESKTOP-LKC15NF root[48320] INFO ['131'] : The transformation function did not reduce the error, removing rotation and retying (old_error=150.300820493, new_error=216.145935456).
2018-04-26 15:04:16 DESKTOP-LKC15NF root[48320] WARNING ['131'] : The transformation function did not reduce the error, removing transform (old_error=150.300820493, new_error=152.121521028).
2018-04-26 15:04:16 DESKTOP-LKC15NF root[48320] INFO ['131'] : The transformation function did not reduce the error, removing rotation and retying (old_error=60.470357891, new_error=121.766700869).
2018-04-26 15:04:16 DESKTOP-LKC15NF root[48320] WARNING ['131'] : The transformation function did not reduce the error, removing transform (old_error=60.470357891, new_error=74.9443876369).
2018-04-26 15:04:16 DESKTOP-LKC15NF root[48320] INFO ['131'] : The transformation function did not reduce the error, removing rotation and retying (old_error=45.228702946, new_error=55.1969896581).
2018-04-26 15:04:17 DESKTOP-LKC15NF root[48320] INFO ['134'] : The transformation function did not reduce the error, removing rotation and retying (old_error=265.443836538, new_error=273.82221852).
2018-04-26 15:04:17 DESKTOP-LKC15NF root[48320] WARNING ['134'] : The transformation function did not reduce the error, removing transform (old_error=265.443836538, new_error=273.879376384).
2018-04-26 15:04:17 DESKTOP-LKC15NF root[48320] INFO ['134'] : The transformation function did not reduce the error, removing rotation and retying (old_error=137.540954836, new_error=160.732810394).
2018-04-26 15:04:17 DESKTOP-LKC15NF root[48320] WARNING ['134'] : The transformation function did not reduce the error, removing transform (old_error=137.540954836, new_error=141.855384896).
2018-04-26 15:04:17 DESKTOP-LKC15NF root[48320] INFO ['134'] : The transformation function did not reduce the error, removing rotation and retying (old_error=110.195257756, new_error=139.597645401).
2018-04-26 15:04:17 DESKTOP-LKC15NF root[48320] WARNING ['134'] : The transformation function did not reduce the error, removing transform (old_error=110.195257756, new_error=117.81178372).
2018-04-26 15:04:17 DESKTOP-LKC15NF root[48320] INFO ['135'] : The transformation function did not reduce the error, removing rotation and retying (old_error=212.593475584, new_error=214.648696545).
2018-04-26 15:04:17 DESKTOP-LKC15NF root[48320] WARNING ['135'] : The transformation function did not reduce the error, removing transform (old_error=212.593475584, new_error=216.79302254).
2018-04-26 15:04:17 DESKTOP-LKC15NF root[48320] INFO ['135'] : The transformation function did not reduce the error, removing rotation and retying (old_error=76.4756776824, new_error=114.754289661).
2018-04-26 15:04:17 DESKTOP-LKC15NF root[48320] WARNING ['135'] : The transformation function did not reduce the error, removing transform (old_error=76.4756776824, new_error=81.6411640466).
2018-04-26 15:04:17 DESKTOP-LKC15NF root[48320] INFO ['135'] : The transformation function did not reduce the error, removing rotation and retying (old_error=120.365805373, new_error=177.671697389).
2018-04-26 15:04:17 DESKTOP-LKC15NF root[48320] WARNING ['135'] : The transformation function did not reduce the error, removing transform (old_error=120.365805373, new_error=126.00328633).
2018-04-26 15:04:17 DESKTOP-LKC15NF root[48320] INFO ['145'] : The transformation function did not reduce the error, removing rotation and retying (old_error=143.517221468, new_error=197.204899151).
2018-04-26 15:04:17 DESKTOP-LKC15NF root[48320] WARNING ['145'] : The transformation function did not reduce the error, removing transform (old_error=143.517221468, new_error=149.583954019).
2018-04-26 15:04:17 DESKTOP-LKC15NF root[48320] INFO ['145'] : The transformation function did not reduce the error, removing rotation and retying (old_error=150.041533084, new_error=162.033322925).
2018-04-26 15:04:17 DESKTOP-LKC15NF root[48320] INFO ['145'] : The transformation function did not reduce the error, removing rotation and retying (old_error=167.373889205, new_error=168.684664536).
2018-04-26 15:04:18 DESKTOP-LKC15NF root[48320] INFO ['146'] : The transformation function did not reduce the error, removing rotation and retying (old_error=142.985515973, new_error=147.188661699).
2018-04-26 15:04:18 DESKTOP-LKC15NF root[48320] WARNING ['146'] : The transformation function did not reduce the error, removing transform (old_error=142.985515973, new_error=144.245538355).
2018-04-26 15:04:18 DESKTOP-LKC15NF root[48320] INFO ['146'] : The transformation function did not reduce the error, removing rotation and retying (old_error=158.713644734, new_error=203.44064574).
2018-04-26 15:04:18 DESKTOP-LKC15NF root[48320] WARNING ['146'] : The transformation function did not reduce the error, removing transform (old_error=158.713644734, new_error=167.701072297).
2018-04-26 15:04:18 DESKTOP-LKC15NF root[48320] INFO ['146'] : The transformation function did not reduce the error, removing rotation and retying (old_error=130.520553161, new_error=161.508504044).
2018-04-26 15:04:18 DESKTOP-LKC15NF root[48320] WARNING ['146'] : The transformation function did not reduce the error, removing transform (old_error=130.520553161, new_error=132.989832282).
2018-04-26 15:04:18 DESKTOP-LKC15NF root[48320] INFO ['147'] : The transformation function did not reduce the error, removing rotation and retying (old_error=131.285313536, new_error=138.616428255).
2018-04-26 15:04:18 DESKTOP-LKC15NF root[48320] WARNING ['147'] : The transformation function did not reduce the error, removing transform (old_error=131.285313536, new_error=133.235979429).
2018-04-26 15:04:18 DESKTOP-LKC15NF root[48320] INFO ['147'] : The transformation function did not reduce the error, removing rotation and retying (old_error=89.2508899386, new_error=96.352389857).
2018-04-26 15:04:18 DESKTOP-LKC15NF root[48320] INFO ['148'] : The transformation function did not reduce the error, removing rotation and retying (old_error=157.241112638, new_error=166.0823058).
2018-04-26 15:04:18 DESKTOP-LKC15NF root[48320] INFO ['148'] : The transformation function did not reduce the error, removing rotation and retying (old_error=361.169521788, new_error=607.117629982).
2018-04-26 15:04:19 DESKTOP-LKC15NF root[48320] WARNING ['148'] : The transformation function did not reduce the error, removing transform (old_error=361.169521788, new_error=604.738211175).
2018-04-26 15:04:19 DESKTOP-LKC15NF root[48320] INFO ['149'] : The transformation function did not reduce the error, removing rotation and retying (old_error=159.265059416, new_error=166.785622117).
2018-04-26 15:04:19 DESKTOP-LKC15NF root[48320] INFO ['149'] : The transformation function did not reduce the error, removing rotation and retying (old_error=155.203301573, new_error=158.232258531).
2018-04-26 15:04:19 DESKTOP-LKC15NF root[48320] WARNING ['149'] : The transformation function did not reduce the error, removing transform (old_error=155.203301573, new_error=158.164274181).
2018-04-26 15:04:19 DESKTOP-LKC15NF root[48320] INFO ['149'] : The transformation function did not reduce the error, removing rotation and retying (old_error=128.557121933, new_error=132.179731725).
2018-04-26 15:04:19 DESKTOP-LKC15NF root[48320] WARNING ['149'] : The transformation function did not reduce the error, removing transform (old_error=128.557121933, new_error=129.06443307).
2018-04-26 15:04:19 DESKTOP-LKC15NF root[48320] INFO Done processing all files. Data can be found in C:\Users\Kevin\Documents\GitHub\msl-iposition-pipeline\examples\saved_data\4-room-intermediate-files\Holodeck 4-Room Spatial Navigation.csv.

Visualization Helper Functions

These functions streamline the later analyses. A working R installation (along with the rpy2 Python package) is required for the R-based statistics.
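If R or rpy2 is missing, the next cell fails with a fairly opaque import error, so a quick availability check can save some debugging time. This is a sketch we added; the helper name is ours, not part of the pipeline:

```python
# Hypothetical pre-flight check: confirm that rpy2 (and therefore a local
# R installation) can actually be imported before running the R-based cells.
def r_available():
    """Return True if rpy2, and the R interpreter it wraps, is importable."""
    try:
        import rpy2.robjects  # noqa: F401 -- requires R on the system PATH
        return True
    except Exception:
        return False

print('R available:', r_available())
```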


In [6]:
import pandas
import numpy as np
import matplotlib.pyplot as plt

import rpy2.robjects.packages as rpackages
from rpy2.robjects.vectors import StrVector
from rpy2.robjects import r, pandas2ri
import rpy2.robjects

pandas2ri.activate()
utils = rpackages.importr('utils')
utils.chooseCRANmirror(ind=1)
packnames = ['afex']
names_to_install = [x for x in packnames if not rpackages.isinstalled(x)]
if len(names_to_install) > 0:
    utils.install_packages(StrVector(names_to_install))
afex = rpackages.importr('afex')

def rANOVA(_data, column, significance_level=0.05, verbose=False):
    r_data = pandas2ri.py2ri(_data) # Convert the data
    ez = r.aov_ez('subID', column, r_data, within='trial') # Run the anova
    p_value = np.array(np.array(ez)[0])[-1][0] # Store the p-value
    print('_'*50)
    if p_value < significance_level: # Check for significance
        print("Significant (p={0}) change in {1} overall.\r\n".format(p_value, column))
        em = r.emmeans(ez, r.formula('~trial')) # Calculate the trial statistics
        forward_difference_anova_result = r.pairs(em) # Generate the Tukey corrected pairwise comparisons
        forward_difference_anova_summary = r.summary(forward_difference_anova_result) # Summarize the results
        adjacent_p_values = np.array(forward_difference_anova_summary)[5][[True, False, False, True, False, True]]
        for p, l in zip(adjacent_p_values, ['First vs. Second', 'Second vs. Third', 'Third vs. Fourth']):
            if p < significance_level:
                print("Significant (p={0}) change in {1}.".format(p, l))
            else:
                print("No Significant (p={0}) change in {1}.".format(p, l))
        if verbose:
            print(ez) # Print the basic anova result
            print(forward_difference_anova_summary) # Print the pairwise comparisons
    else:
        print("No Significant (p={0}) change in {1} overall.".format(p_value, column))
    print('_'*50)
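The boolean mask `[True, False, False, True, False, True]` inside `rANOVA` deserves a note: assuming emmeans reports the six pairwise contrasts of four trials in combination order (as the labels in this notebook imply), the mask keeps only the adjacent-trial comparisons. A small check of that ordering:

```python
import itertools

# The 6 pairwise contrasts of 4 trials, in combination order
contrasts = list(itertools.combinations(range(4), 2))
# Keep only adjacent-trial comparisons (0 vs 1, 1 vs 2, 2 vs 3)
mask = [b - a == 1 for a, b in contrasts]
print(contrasts)  # [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
print(mask)       # [True, False, False, True, False, True]
```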

In [7]:
import pandas
import numpy as np
import matplotlib.pyplot as plt

def visualize_columns(data, column_names, titles, ylabels, fig_size=(15, 5), separate_plots=True, legend_labels=None, subplot_shape=None, verbose_anova=False):
    # Extract the columns of interest
    trial_num = data['trial']
    columns = [data[c_name] for c_name in column_names]
    
    # Generate useful constants
    trials = sorted(set(trial_num))  # sorted so the x-axis is in trial order
    means = [[np.mean(column[trial_num == i]) for i in trials] for column in columns]
    # Standard error uses the per-trial sample size
    std_errs = [[np.std(column[trial_num == i]) / np.sqrt(np.sum(trial_num == i)) for i in trials] for column in columns]
    
    # Visualize each trial-over-trial mean in a subplot
    if separate_plots:
        if subplot_shape is None:
            f, axarr = plt.subplots(1, len(column_names))
        else:
            f, axarr = plt.subplots(*subplot_shape)
            axarr = [j for i in axarr for j in i]
        if len(column_names) == 1:
            axarr = [axarr]
        f.set_size_inches(fig_size)
        for ax, title, mean, std_err, ylabel in zip(axarr, titles, means, std_errs, ylabels):
            ax.errorbar(trials, mean, std_err)
            ax.set_title(title)
            ax.grid(True)
            ax.set_xlabel('Trials')
            ax.set_ylabel(ylabel)
            ax.xaxis.set_ticks(trials)
        plt.show()
    else:
        f = plt.figure()
        f.set_size_inches(fig_size)
        for idx, (title, mean, std_err, ylabel) in enumerate(zip(titles, means, std_errs, ylabels)):
            label = ''
            if legend_labels is not None:
                label = legend_labels[idx]
            plt.errorbar(trials, mean, std_err, label=label)
            plt.title(title)
            plt.grid(True)
            plt.xlabel('Trials')
            plt.ylabel(ylabel)
            plt.gca().xaxis.set_ticks(trials)
            plt.legend()
        plt.show()
    
    for column in column_names:
        rANOVA(data, column.replace(' ', '.'), verbose=verbose_anova)
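The mean/standard-error loop above can also be expressed with a pandas groupby, which is less error-prone when trials are unevenly sampled. A minimal sketch (the helper name and toy data are ours):

```python
import numpy as np
import pandas as pd

def trial_summary(data, column):
    """Per-trial mean and standard error of `column` (population std, as above)."""
    g = data.groupby('trial')[column]
    return pd.DataFrame({'mean': g.mean(),
                         'sem': g.std(ddof=0) / np.sqrt(g.count())})

# Toy data with two trials of two observations each
toy = pd.DataFrame({'trial': [0, 0, 1, 1],
                    'Original Misplacement': [1.0, 3.0, 2.0, 2.0]})
print(trial_summary(toy, 'Original Misplacement'))
```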

Misplacement and Misassignment in Space

In this section, we'll look at the basic Misplacement and Misassignment metrics.


In [8]:
import pandas as pd

# Load the data
data = pd.read_csv(test_cogrecon_filename, skiprows=1)

columns = ['Original Misplacement']
titles = ['Misplacement']
ylabels = ['Misplacement (meters)']
visualize_columns(data, columns, titles, ylabels)


__________________________________________________
Significant (p=4.16237885805e-17) change in Original.Misplacement overall.

Significant (p=5.31689260397e-08) change in First vs. Second.
Significant (p=0.00405846070848) change in Second vs. Third.
No Significant (p=0.137262396593) change in Third vs. Fourth.
__________________________________________________

In [9]:
import pandas as pd

# Load the data
data = pd.read_csv(test_cogrecon_filename, skiprows=1)

visualize_columns(data, ['Accurate Misassignment'], ['Misassignment'], ['Number of Items'], verbose_anova=True)


__________________________________________________
Significant (p=3.14044158211e-08) change in Accurate.Misassignment overall.

Significant (p=0.00656836891279) change in First vs. Second.
No Significant (p=0.233565665432) change in Second vs. Third.
No Significant (p=0.665879553244) change in Third vs. Fourth.
Anova Table (Type 3 tests)

Response: Accurate.Misassignment
  Effect           df  MSE         F ges p.value
1  trial 2.81, 121.04 4.11 15.42 *** .08  <.0001
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1

Sphericity correction method: GG 

 contrast  estimate        SE  df t.ratio p.value
 X0 - X1  1.3863636 0.4188695 129   3.310  0.0066
 X0 - X2  2.1818182 0.4188695 129   5.209  <.0001
 X0 - X3  2.6590909 0.4188695 129   6.348  <.0001
 X1 - X2  0.7954545 0.4188695 129   1.899  0.2336
 X1 - X3  1.2727273 0.4188695 129   3.038  0.0151
 X2 - X3  0.4772727 0.4188695 129   1.139  0.6659

P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________

Context Boundary Effects (in Space)

In this section, we'll read the segmentation file and generate our more traditional measures of context boundary effects. Once generated, we will visualize them and, if specified, save them to an intermediate file for later use.


In [10]:
import numpy as np
import pandas as pd

# Load the data
data = pd.read_csv(segmentation_filename)

# Rename to get the naming standards to be correct for the processing
old_column_names = list(data.columns)
old_column_names[0] = 'subID'
old_column_names[1] = 'trial'
data.columns = old_column_names

# Extract the mean normed distances and build a simple summary table
data_summary = pd.DataFrame()
grouped_data = data.groupby(['subID', 'trial', 'context_crossing_class'])
tmp_dist = np.array(grouped_data['normed_distance'].mean())
tmp_sub = np.array(grouped_data['subID'].mean())
tmp_trial = np.array(grouped_data['trial'].mean())
data_summary['subID'] = tmp_sub[0::2]
data_summary['trial'] = tmp_trial[0::2]
data_summary['context_crossing_dist_no_exclusions'] = tmp_dist[0::2]
data_summary['context_noncrossing_dist_no_exclusions'] = tmp_dist[1::2]

tmp = []
for index, row in data.iterrows():
    if not row['first_item_expected_color_room'] == row['first_item_actual_color_room'] or not row['second_item_expected_color_room'] == row['second_item_actual_color_room']:
        tmp.append(np.nan)
    else:
        tmp.append(row['normed_distance'])
data['normed_distance'] = tmp

grouped_data = data.groupby(['subID', 'trial', 'context_crossing_class'])
tmp_dist = np.array(grouped_data['normed_distance'].sum())
tmp_sub = np.array(grouped_data['subID'].mean())
tmp_trial = np.array(grouped_data['trial'].mean())
data_summary['context_crossing_dist_exclude_wrong_color_pairs'] = tmp_dist[0::2]
data_summary['context_noncrossing_dist_exclude_wrong_color_pairs'] = tmp_dist[1::2]
        
# Visualize
columns = ['context_crossing_dist_no_exclusions', 'context_noncrossing_dist_no_exclusions']
titles = ['', 'Within vs. Across (no exclusions)']
legend_labels = ['Across', 'Within']
ylabels = ['Normalized Distance (1 is perfect)', 'Normalized Distance (1 is perfect)']

visualize_columns(data_summary, columns, titles, ylabels, separate_plots=False, legend_labels=legend_labels)

columns = ['context_crossing_dist_exclude_wrong_color_pairs', 'context_noncrossing_dist_exclude_wrong_color_pairs']
titles = ['', 'Within vs. Across (excluding wrong color)']
legend_labels = ['Across', 'Within']
ylabels = ['Normalized Distance (1 is perfect)', 'Normalized Distance (1 is perfect)']

visualize_columns(data_summary, columns, titles, ylabels, separate_plots=False, legend_labels=legend_labels)

if 'generate_intermediate_files' in vars() and generate_intermediate_files:
    data_summary.to_csv(cbe_intermediate_filename)


__________________________________________________
Significant (p=0.000674331444256) change in context_crossing_dist_no_exclusions overall.

No Significant (p=0.776525373361) change in First vs. Second.
No Significant (p=0.098047164818) change in Second vs. Third.
No Significant (p=0.981129575628) change in Third vs. Fourth.
__________________________________________________
__________________________________________________
Significant (p=0.0204552423229) change in context_noncrossing_dist_no_exclusions overall.

No Significant (p=0.0607509653576) change in First vs. Second.
No Significant (p=0.99471842726) change in Second vs. Third.
No Significant (p=0.999523622752) change in Third vs. Fourth.
__________________________________________________
__________________________________________________
Significant (p=0.0159550011142) change in context_crossing_dist_exclude_wrong_color_pairs overall.

No Significant (p=0.706751321029) change in First vs. Second.
No Significant (p=0.486956591932) change in Second vs. Third.
No Significant (p=0.963238697263) change in Third vs. Fourth.
__________________________________________________
__________________________________________________
Significant (p=4.25753730826e-14) change in context_noncrossing_dist_exclude_wrong_color_pairs overall.

Significant (p=8.45970693497e-09) change in First vs. Second.
No Significant (p=0.385659473161) change in Second vs. Third.
No Significant (p=0.313553037714) change in Third vs. Fourth.
__________________________________________________
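As an aside, the `[0::2]`/`[1::2]` slicing in the cell above depends on the groupby producing exactly two `context_crossing_class` rows per (subID, trial) in a fixed order. An `unstack` makes that pairing explicit and fails loudly when the assumption breaks. A sketch on toy data (the class labels and values here are invented for illustration):

```python
import pandas as pd

# Toy frame: one subject, two trials, one crossing and one non-crossing pair each
toy = pd.DataFrame({
    'subID': [1, 1, 1, 1],
    'trial': [0, 0, 1, 1],
    'context_crossing_class': ['across', 'within', 'across', 'within'],
    'normed_distance': [0.8, 0.9, 0.85, 0.95],
})

# Mean per group, then pivot the crossing class into columns
summary = (toy.groupby(['subID', 'trial', 'context_crossing_class'])['normed_distance']
              .mean()
              .unstack('context_crossing_class')
              .reset_index())
print(summary)
```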

Effect of Context on Misassignment

The analysis of the effect of context on misassignment is not included in any of the packages directly, so we need some custom computation to produce these numbers.

First, we read the data from the Accurate Misassignment Pairs column of the test output file. Because we're working in a single notebook, this output was already generated above and simply needs to be reloaded.

Then, we'll process each list of pairs, counting the number of within- vs. across-context pairs.


In [11]:
import itertools
import ast
import pandas as pd
import numpy as np

# Load the data
data = pd.read_csv(test_cogrecon_filename, skiprows=1)
misassignment_pairs = [ast.literal_eval(row) for row in data['Accurate Misassignment Pairs']]

# The pairs which share a context (note that order doesn't matter for this)
# These are tuples, but in this analysis, we don't use tuples because then chance level is more complicated (plus they aren't related to the question at hand)
# within_key = [[1, 2], [2, 1], [5, 6], [6, 5], [13, 15], [15, 13], [11, 7], [7, 11]]
# across_key = [[7, 8], [8, 7], [15, 14], [14, 15], [6, 0], [0, 6], [2, 4], [4, 2]]

context_item_indices = [[4, 7, 11, 12], [8, 9, 13, 15], [0, 1, 2, 10], [3, 5, 6, 14]]
within_key = []
for context in context_item_indices:
    keys = list(itertools.product(context, context))
    for key in keys:
        if key[0] != key[1]:
            within_key.append(list(key))

all_keys = []
for key in list(itertools.product(list(range(0, 16)), list(range(0, 16)))):
    if key[0] != key[1]:
        all_keys.append(list(key))

# All ordered pairs of distinct items that do not share a context
across_key = [key for key in all_keys if key not in within_key]

# The items to exclude because they had no contextual information
# thus if they were given temporal information, they would not be a valid misassignment
exclusion_items = []

within_list = []
across_list = []
totals_list = []
for i, a in enumerate(misassignment_pairs):
    totals_list.append(len(a))
    within_list.append(0)
    across_list.append(0)
    for el in a:
        if all([el_i not in exclusion_items for el_i in el]):
            if el in within_key:
                within_list[-1] += 1
            elif el in across_key:
                across_list[-1] += 1
within_list_proportion = [float(x)/float(y) if y != 0 else np.nan for x, y in zip(within_list, totals_list)]
across_list_proportion = [float(x)/float(y) if y != 0 else np.nan for x, y in zip(across_list, totals_list)]
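As a sanity check on the key construction above: 16 items yield 16 × 15 = 240 ordered pairs, and 4 contexts of 4 items each contribute 4 × (4 × 3) = 48 within-context pairs, so a random misassignment lands within-context with probability 48/240 = 0.2. A small standalone sketch:

```python
import itertools

contexts = [[4, 7, 11, 12], [8, 9, 13, 15], [0, 1, 2, 10], [3, 5, 6, 14]]

# Ordered within-context pairs (ordering kept to match the keys above)
within = [p for c in contexts for p in itertools.permutations(c, 2)]
# All ordered pairs of distinct items
all_pairs = list(itertools.permutations(range(16), 2))

chance = len(within) / float(len(all_pairs))
print(len(within), len(all_pairs), chance)  # → 48 240 0.2
```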

Now, we can visualize the information we'd like.


In [12]:
import pandas as pd

data = pd.read_csv(test_cogrecon_filename, skiprows=1)

num_elements = len(within_list_proportion)

chance = float(len(within_key)) / float(len(within_key) + len(across_key))

data['within_list_proportion'] = pd.Series(within_list_proportion)
data['across_list_proportion'] = pd.Series(across_list_proportion)
data['within'] = pd.Series(within_list)
data['across'] = pd.Series(across_list)
data['top'] = pd.Series([chance]*num_elements)
data['bottom'] = pd.Series([(1.-chance)]*num_elements)

columns_proportion = ['top', 'bottom', 'within_list_proportion', 'across_list_proportion']
titles_proportion = ['', '', '', 'Within vs. Across Proportion']
legend_labels_proportion = ["Across Chance", "Within Chance", "Within", "Across"]
ylabels_proportion = ['', '', '', 'Proportion of Misassignments']

columns = ['within', 'across']
titles = ['', 'Within vs. Across']
legend_labels = ["Within", "Across"]
ylabels = ['', 'Number of Misassigned Items']

visualize_columns(data, columns_proportion, titles_proportion, ylabels_proportion, separate_plots=False, legend_labels=legend_labels_proportion, verbose_anova=True)
visualize_columns(data, columns, titles, ylabels, separate_plots=False, legend_labels=legend_labels, verbose_anova=True)


__________________________________________________
No Significant (p=nan) change in top overall.
__________________________________________________
__________________________________________________
No Significant (p=nan) change in bottom overall.
__________________________________________________
__________________________________________________
Significant (p=0.00017775078458) change in within_list_proportion overall.

No Significant (p=0.166396760125) change in First vs. Second.
No Significant (p=0.189600821246) change in Second vs. Third.
No Significant (p=0.993437169852) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: within_list_proportion

  Effect          df  MSE        F ges p.value

1  trial 2.74, 54.89 0.05 8.34 *** .15   .0002

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast    estimate         SE df t.ratio p.value

 X0 - X1  -0.14147427 0.06752171 60  -2.095  0.1664

 X0 - X2  -0.27834982 0.06752171 60  -4.122  0.0007

 X0 - X3  -0.29622415 0.06752171 60  -4.387  0.0003

 X1 - X2  -0.13687556 0.06752171 60  -2.027  0.1896

 X1 - X3  -0.15474988 0.06752171 60  -2.292  0.1114

 X2 - X3  -0.01787432 0.06752171 60  -0.265  0.9934



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=0.00017775078458) change in across_list_proportion overall.

No Significant (p=0.166396760125) change in First vs. Second.
No Significant (p=0.189600821246) change in Second vs. Third.
No Significant (p=0.993437169852) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: across_list_proportion

  Effect          df  MSE        F ges p.value

1  trial 2.74, 54.89 0.05 8.34 *** .15   .0002

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast   estimate         SE df t.ratio p.value

 X0 - X1  0.14147427 0.06752171 60   2.095  0.1664

 X0 - X2  0.27834982 0.06752171 60   4.122  0.0007

 X0 - X3  0.29622415 0.06752171 60   4.387  0.0003

 X1 - X2  0.13687556 0.06752171 60   2.027  0.1896

 X1 - X3  0.15474988 0.06752171 60   2.292  0.1114

 X2 - X3  0.01787432 0.06752171 60   0.265  0.9934



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
No Significant (p=0.45260164812) change in within overall.
__________________________________________________
__________________________________________________
Significant (p=4.43874827967e-09) change in across overall.

Significant (p=0.00635637008876) change in First vs. Second.
Significant (p=0.0215495616098) change in Second vs. Third.
No Significant (p=0.849748488753) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: across

  Effect           df  MSE         F ges p.value

1  trial 2.38, 102.13 2.19 20.43 *** .15  <.0001

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast  estimate        SE  df t.ratio p.value

 X0 - X1  0.9318182 0.2806611 129   3.320  0.0064

 X0 - X2  1.7500000 0.2806611 129   6.235  <.0001

 X0 - X3  1.9772727 0.2806611 129   7.045  <.0001

 X1 - X2  0.8181818 0.2806611 129   2.915  0.0215

 X1 - X3  1.0454545 0.2806611 129   3.725  0.0016

 X2 - X3  0.2272727 0.2806611 129   0.810  0.8497



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________

Next we can save these results to an intermediate file for running statistics.


In [13]:
if 'generate_intermediate_files' in vars() and generate_intermediate_files:
    with open(misassignment_intermediate_filename, 'w') as fp:
        fp.write('subID,trial,total_misassignments,within_misassignments,across_misassignments,within_misassignment_proportions,across_misassignment_proportions\n')
        for sid, tr, t, w, a, wp, ap in zip(data['subID'], data['trial'], totals_list, within_list, across_list, within_list_proportion, across_list_proportion):
            fp.write('{0},{1},{2},{3},{4},{5},{6}\n'.format(sid, tr, t, w, a, wp, ap))

Next, we can create the summary file for navigation metrics. The Data Conversion section created an intermediate file called study_path.csv stored in ./{output_directory} from which we will generate the appropriate data. Note that this functionality is not included in the cogrecon package directly for the 4-room data format, but we can copy much of what is done in cogrecon.core.data_flexing.time_travel_task.time_travel_task_analytics.summarize_navigation_data.


In [14]:
data = pd.read_csv(os.path.join('.', output_directory, 'study_path.csv'))

In [15]:
grp = data.groupby(['subject_id', 'trial_number'])

In [16]:
import matplotlib.pyplot as plt
for name, group in grp:
    x, y = np.transpose(np.array(group[['x', 'z']]))
    if any([xx < -25 for xx in x]):
        print('{0} is weird.'.format(name))
    plt.plot(x, y)
plt.show()


(103, 0) is weird.

In [17]:
import tqdm
import pandas as pd
import numpy as np
import scipy.spatial.distance as distance
from cogrecon.core.data_flexing.time_travel_task.time_travel_task_analytics import calculate_fd_and_lacunarity

def summarize_navigation_data_4room(intermediate_filename,
                                    output_path, 
                                    output_path_contexts,
                                    verbose=True,
                                    fd_indicies_time=None,
                                    fd_indicies_space=None,
                                    fd_indicies_spacetime=None):
    # Open in text mode with newline='' so the explicit '\r\n' line endings
    # below are written verbatim (binary mode would reject str writes in Python 3)
    fp = open(output_path, 'w', newline='')
    fp_contexts = open(output_path_contexts, 'w', newline='')
    
    header = (
        'subID,trial,context_order,total_time,total_distance,fd_space,lacunarity_space'
    )

    header_contexts = (
        'subID,trial,context,total_time,total_distance,fd_space,lacunarity_space'
    )
    
    fp.write(header + '\r\n')
    fp_contexts.write(header_contexts + '\r\n')

    data = pd.read_csv(intermediate_filename)
    
    grp = data.groupby(['subject_id', 'trial_number'])
    grp_contexts = data.groupby(['subject_id', 'trial_number', 'room_by_color'])
    
    t = tqdm.tqdm(grp_contexts)
    for name, group in t:
        subID = name[0]
        trial = name[1]
        context = name[2]

        timeline = group['time']
        total_time = max(timeline) - min(timeline)
        
        spaceline = np.transpose(np.array([list(group['x']), list(group['z'])]))
        space_travelled = sum([distance.euclidean(spaceline[idx-1], spaceline[idx]) for idx in range(1, len(spaceline))])
        fd_s, lac_s = calculate_fd_and_lacunarity(spaceline, indicies=fd_indicies_space)

        line = ','.join([str(subID), str(trial), str(context), str(total_time), str(space_travelled), str(fd_s), str(lac_s)])

        if verbose:
            t.set_description(line)

        fp_contexts.write(line + '\r\n')
        fp_contexts.flush()
        
    t = tqdm.tqdm(grp)
    for name, group in t:
        subID = name[0]
        trial = name[1]

        context_order = np.array(group['room_by_color'].unique())
        
        timeline = group['time']
        total_time = max(timeline) - min(timeline)
        
        spaceline = np.transpose(np.array([list(group['x']), list(group['z'])]))
        space_travelled = sum([distance.euclidean(spaceline[idx-1], spaceline[idx]) for idx in range(1, len(spaceline))])
        fd_s, lac_s = calculate_fd_and_lacunarity(spaceline, indicies=fd_indicies_space)

        line = ','.join([str(subID), str(trial), str('-'.join(context_order)), str(total_time), str(space_travelled), str(fd_s), str(lac_s)])

        if verbose:
            t.set_description(line)

        fp.write(line + '\r\n')
        fp.flush()

    fp_contexts.close()
    fp.close()

    return True
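The per-segment distance summation inside summarize_navigation_data_4room can be checked on a toy trajectory. A vectorized equivalent of the distance.euclidean loop (a sketch, not part of the package):

```python
import numpy as np

# Toy 2D trajectory: three unit-length segments along a square
spaceline = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])

# Per-segment Euclidean distances between consecutive points, summed —
# equivalent to summing distance.euclidean(spaceline[idx-1], spaceline[idx])
segment_lengths = np.linalg.norm(np.diff(spaceline, axis=0), axis=1)
total_distance = segment_lengths.sum()
print(total_distance)  # → 3.0
```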
Note: The next cell can take a significant amount of time to run. If the intermediate data file (*time_travel_task_navigation_summary.csv* by default) already exists, consider skipping the next cell.

To run the cell, set generate_intermediate_files to True (in [Data Directories](#Data-Directories)).

See FD_Lacunarity.ipynb for details on how the fd_indicies values were determined. They can be calculated per individual by leaving the corresponding fd_indicies_* argument as None. Otherwise, a list of scale indices is expected (in this case, determined by the average window across participants).


In [18]:
import os

if 'generate_intermediate_files' in vars() and generate_intermediate_files:
    summarize_navigation_data_4room(os.path.join('.', output_directory, 'study_path.csv'),
                                    nav_intermediate_filename, nav_context_intermediate_filename, verbose=True,
                                    fd_indicies_space=[6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16])

In [19]:
import pandas

# Load the data
data_nav = pandas.read_csv(nav_intermediate_filename)

columns = ['total_time', 'total_distance', 'fd_space', 'lacunarity_space']
titles = ['Total Time', 'Spatial Distance', 'Spatial FD', 'Spatial Lacunarity']
ylabels = ['Time (seconds)', 'Distance Travelled (meters)', '', '']

visualize_columns(data_nav, columns, titles, ylabels, separate_plots=True, subplot_shape=[2, 2], fig_size=(15, 15), verbose_anova=True)


__________________________________________________
Significant (p=3.24930136326e-07) change in total_time overall.

No Significant (p=0.357777754107) change in First vs. Second.
No Significant (p=0.224861599835) change in Second vs. Third.
Significant (p=0.022232790999) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: total_time

  Effect           df                   MSE         F ges p.value

1  trial 2.35, 100.90 461307377270371328.00 15.44 *** .17  <.0001

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast   estimate        SE  df t.ratio p.value

 X0 - X1  -210547156 128064280 129  -1.644  0.3578

 X0 - X2  -456409001 128064280 129  -3.564  0.0029

 X0 - X3  -828330648 128064280 129  -6.468  <.0001

 X1 - X2  -245861846 128064280 129  -1.920  0.2249

 X1 - X3  -617783492 128064280 129  -4.824  <.0001

 X2 - X3  -371921647 128064280 129  -2.904  0.0222



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=2.09123013731e-10) change in total_distance overall.

Significant (p=1.55216943243e-06) change in First vs. Second.
No Significant (p=0.229756774636) change in Second vs. Third.
No Significant (p=0.693656499993) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: total_distance

  Effect          df      MSE         F ges p.value

1  trial 2.07, 89.14 20661.42 28.10 *** .28  <.0001

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast  estimate       SE  df t.ratio p.value

 X0 - X1  138.51971 25.47444 129   5.438  <.0001

 X0 - X2  187.12698 25.47444 129   7.346  <.0001

 X0 - X3  215.00350 25.47444 129   8.440  <.0001

 X1 - X2   48.60727 25.47444 129   1.908  0.2298

 X1 - X3   76.48379 25.47444 129   3.002  0.0168

 X2 - X3   27.87652 25.47444 129   1.094  0.6937



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=7.99678170216e-13) change in fd_space overall.

Significant (p=5.97376467272e-05) change in First vs. Second.
No Significant (p=0.0708859803321) change in Second vs. Third.
No Significant (p=0.169167380778) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: fd_space

  Effect           df  MSE         F ges p.value

1  trial 2.47, 106.10 0.00 30.84 *** .27  <.0001

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast   estimate          SE  df t.ratio p.value

 X0 - X1  0.03333699 0.007255639 129   4.595  0.0001

 X0 - X2  0.05120043 0.007255639 129   7.057  <.0001

 X0 - X3  0.06620531 0.007255639 129   9.125  <.0001

 X1 - X2  0.01786344 0.007255639 129   2.462  0.0709

 X1 - X3  0.03286832 0.007255639 129   4.530  0.0001

 X2 - X3  0.01500488 0.007255639 129   2.068  0.1692



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=4.1497366866e-11) change in lacunarity_space overall.

Significant (p=1.00727002506e-06) change in First vs. Second.
No Significant (p=0.345443982746) change in Second vs. Third.
No Significant (p=0.669286906663) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: lacunarity_space

  Effect          df  MSE         F ges p.value

1  trial 2.31, 99.48 0.05 27.30 *** .26  <.0001

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast   estimate         SE  df t.ratio p.value

 X0 - X1  0.23052997 0.04167135 129   5.532  <.0001

 X0 - X2  0.29998992 0.04167135 129   7.199  <.0001

 X0 - X3  0.34724256 0.04167135 129   8.333  <.0001

 X1 - X2  0.06945994 0.04167135 129   1.667  0.3454

 X1 - X3  0.11671258 0.04167135 129   2.801  0.0296

 X2 - X3  0.04725264 0.04167135 129   1.134  0.6693



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________

In [20]:
import pandas

# Load the data
data_nav_contexts = pandas.read_csv(nav_context_intermediate_filename)

grps = data_nav_contexts.groupby('context')
for name, group in grps:
    columns = ['total_time', 'total_distance', 'fd_space', 'lacunarity_space']
    titles = [x + ' ' + name.title() for x in ['Total Time', 'Spatial Distance', 'Spatial FD', 'Spatial Lacunarity']]
    ylabels = ['Time (seconds)', 'Distance Travelled (meters)', '', '']

    visualize_columns(group, columns, titles, ylabels, separate_plots=True, subplot_shape=[2, 2], fig_size=(15, 15), verbose_anova=True)


__________________________________________________
Significant (p=3.39598497944e-05) change in total_time overall.

Significant (p=0.0424551256409) change in First vs. Second.
No Significant (p=0.0802959330047) change in Second vs. Third.
No Significant (p=0.998037183112) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: total_time

  Effect          df                   MSE         F ges p.value

1  trial 2.08, 87.41 181550026771830496.00 11.29 *** .17  <.0001

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast  estimate       SE  df t.ratio p.value

 X0 - X1  204143591 76539384 126   2.667  0.0425

 X0 - X2  388629542 76539384 126   5.078  <.0001

 X0 - X3  375120801 76539384 126   4.901  <.0001

 X1 - X2  184485951 76539384 126   2.410  0.0803

 X1 - X3  170977210 76539384 126   2.234  0.1198

 X2 - X3  -13508741 76539384 126  -0.176  0.9980



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=1.50705608598e-06) change in total_distance overall.

Significant (p=0.00289926966813) change in First vs. Second.
No Significant (p=0.118054271809) change in Second vs. Third.
No Significant (p=0.999999956074) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: total_distance

  Effect          df     MSE         F ges p.value

1  trial 2.12, 89.02 3675.59 14.99 *** .19  <.0001

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast    estimate       SE  df t.ratio p.value

 X0 - X1  39.13997056 10.99042 126   3.561  0.0029

 X0 - X2  63.76348317 10.99042 126   5.802  <.0001

 X0 - X3  63.70903721 10.99042 126   5.797  <.0001

 X1 - X2  24.62351261 10.99042 126   2.240  0.1181

 X1 - X3  24.56906665 10.99042 126   2.235  0.1193

 X2 - X3  -0.05444596 10.99042 126  -0.005  1.0000



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=1.29955716963e-05) change in fd_space overall.

No Significant (p=0.0703646532146) change in First vs. Second.
No Significant (p=0.0556539822379) change in Second vs. Third.
No Significant (p=0.922927715654) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: fd_space

  Effect           df  MSE         F ges p.value

1  trial 2.66, 111.73 0.00 10.23 *** .13  <.0001

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast     estimate         SE  df t.ratio p.value

 X0 - X1   0.034733896 0.01408652 126   2.466  0.0704

 X0 - X2   0.070812403 0.01408652 126   5.027  <.0001

 X0 - X3   0.061968001 0.01408652 126   4.399  0.0001

 X1 - X2   0.036078506 0.01408652 126   2.561  0.0557

 X1 - X3   0.027234105 0.01408652 126   1.933  0.2194

 X2 - X3  -0.008844402 0.01408652 126  -0.628  0.9229



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=1.93780101121e-05) change in lacunarity_space overall.

Significant (p=0.0313180857371) change in First vs. Second.
No Significant (p=0.0627324305003) change in Second vs. Third.
No Significant (p=0.861508063824) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: lacunarity_space

  Effect          df  MSE         F ges p.value

1  trial 2.31, 97.07 0.17 11.01 *** .14  <.0001

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast    estimate         SE  df t.ratio p.value

 X0 - X1   0.21950866 0.07891486 126   2.782  0.0313

 X0 - X2   0.41781668 0.07891486 126   5.295  <.0001

 X0 - X3   0.35593181 0.07891486 126   4.510  0.0001

 X1 - X2   0.19830802 0.07891486 126   2.513  0.0627

 X1 - X3   0.13642316 0.07891486 126   1.729  0.3132

 X2 - X3  -0.06188486 0.07891486 126  -0.784  0.8615



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=0.00151542629395) change in total_time overall.

No Significant (p=0.0599806544096) change in First vs. Second.
No Significant (p=0.999791621281) change in Second vs. Third.
No Significant (p=0.524488650342) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: total_time

  Effect           df                   MSE       F ges p.value

1  trial 2.94, 123.27 104536036030235760.00 5.50 ** .08    .002

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast  estimate       SE  df t.ratio p.value

 X0 - X1  174571573 68969829 126   2.531  0.0600

 X0 - X2  180317636 68969829 126   2.614  0.0486

 X0 - X3  274364312 68969829 126   3.978  0.0007

 X1 - X2    5746063 68969829 126   0.083  0.9998

 X1 - X3   99792739 68969829 126   1.447  0.4727

 X2 - X3   94046676 68969829 126   1.364  0.5245



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=0.00399476260942) change in total_distance overall.

Significant (p=0.00484129700427) change in First vs. Second.
No Significant (p=0.945967631162) change in Second vs. Third.
No Significant (p=0.97744243962) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: total_distance

  Effect           df     MSE       F ges p.value

1  trial 2.52, 105.90 1835.65 5.15 ** .08    .004

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast  estimate       SE  df t.ratio p.value

 X0 - X1  28.856717 8.471199 126   3.406  0.0048

 X0 - X2  24.184211 8.471199 126   2.855  0.0256

 X0 - X3  27.615592 8.471199 126   3.260  0.0077

 X1 - X2  -4.672506 8.471199 126  -0.552  0.9460

 X1 - X3  -1.241125 8.471199 126  -0.147  0.9989

 X2 - X3   3.431381 8.471199 126   0.405  0.9774



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=0.0109091609222) change in fd_space overall.

Significant (p=0.0039507789448) change in First vs. Second.
No Significant (p=0.633186724603) change in Second vs. Third.
No Significant (p=0.953050344326) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: fd_space

  Effect          df  MSE      F ges p.value

1  trial 2.20, 92.43 0.01 4.53 * .07     .01

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast     estimate         SE  df t.ratio p.value

 X0 - X1   0.053862685 0.01552903 126   3.469  0.0040

 X0 - X2   0.035356081 0.01552903 126   2.277  0.1090

 X0 - X3   0.043499880 0.01552903 126   2.801  0.0297

 X1 - X2  -0.018506603 0.01552903 126  -1.192  0.6332

 X1 - X3  -0.010362805 0.01552903 126  -0.667  0.9092

 X2 - X3   0.008143798 0.01552903 126   0.524  0.9531



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=0.0103557898114) change in lacunarity_space overall.

Significant (p=0.00497700030726) change in First vs. Second.
No Significant (p=0.876358686615) change in Second vs. Third.
No Significant (p=0.931312390278) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: lacunarity_space

  Effect          df  MSE      F ges p.value

1  trial 1.87, 78.48 0.23 5.01 * .08     .01

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast    estimate         SE  df t.ratio p.value

 X0 - X1   0.27696207 0.08150891 126   3.398  0.0050

 X0 - X2   0.21580094 0.08150891 126   2.648  0.0447

 X0 - X3   0.26485526 0.08150891 126   3.249  0.0080

 X1 - X2  -0.06116112 0.08150891 126  -0.750  0.8764

 X1 - X3  -0.01210681 0.08150891 126  -0.149  0.9988

 X2 - X3   0.04905431 0.08150891 126   0.602  0.9313



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=0.000196027040874) change in total_time overall.

No Significant (p=0.987158202593) change in First vs. Second.
Significant (p=0.0265633797946) change in Second vs. Third.
No Significant (p=0.477418771043) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: total_time

  Effect          df                   MSE        F ges p.value

1  trial 1.86, 78.21 225510699339018592.00 9.99 *** .15   .0002

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast  estimate       SE  df t.ratio p.value

 X0 - X1   26908501 80690629 126   0.333  0.9872

 X0 - X2  256205605 80690629 126   3.175  0.0100

 X0 - X3  372340397 80690629 126   4.614  0.0001

 X1 - X2  229297103 80690629 126   2.842  0.0266

 X1 - X3  345431895 80690629 126   4.281  0.0002

 X2 - X3  116134792 80690629 126   1.439  0.4774



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=3.67122201562e-06) change in total_distance overall.

Significant (p=0.0323233235804) change in First vs. Second.
No Significant (p=0.109294413134) change in Second vs. Third.
No Significant (p=0.854441263534) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: total_distance

  Effect          df     MSE         F ges p.value

1  trial 2.14, 89.77 2860.39 13.76 *** .19  <.0001

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast  estimate       SE  df t.ratio p.value

 X0 - X1  26.967902 9.735995 126   2.770  0.0323

 X0 - X2  49.122183 9.735995 126   5.045  <.0001

 X0 - X3  56.907885 9.735995 126   5.845  <.0001

 X1 - X2  22.154280 9.735995 126   2.276  0.1093

 X1 - X3  29.939982 9.735995 126   3.075  0.0136

 X2 - X3   7.785702 9.735995 126   0.800  0.8544



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=6.98207424862e-08) change in fd_space overall.

Significant (p=0.0357192531076) change in First vs. Second.
No Significant (p=0.11864938932) change in Second vs. Third.
No Significant (p=0.396596244848) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: fd_space

  Effect           df  MSE         F ges p.value

1  trial 2.53, 106.25 0.00 16.18 *** .21  <.0001

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast   estimate         SE  df t.ratio p.value

 X0 - X1  0.02904113 0.01062726 126   2.733  0.0357

 X0 - X2  0.05282649 0.01062726 126   4.971  <.0001

 X0 - X3  0.06956384 0.01062726 126   6.546  <.0001

 X1 - X2  0.02378535 0.01062726 126   2.238  0.1186

 X1 - X3  0.04052271 0.01062726 126   3.813  0.0012

 X2 - X3  0.01673736 0.01062726 126   1.575  0.3966



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=2.22365713428e-09) change in lacunarity_space overall.

Significant (p=0.00269360827137) change in First vs. Second.
No Significant (p=0.0804058222817) change in Second vs. Third.
No Significant (p=0.817571945383) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: lacunarity_space

  Effect           df  MSE         F ges p.value

1  trial 2.70, 113.46 0.07 18.88 *** .23  <.0001

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast   estimate         SE  df t.ratio p.value

 X0 - X1  0.19045213 0.05315369 126   3.583  0.0027

 X0 - X2  0.31853974 0.05315369 126   5.993  <.0001

 X0 - X3  0.36507121 0.05315369 126   6.868  <.0001

 X1 - X2  0.12808761 0.05315369 126   2.410  0.0804

 X1 - X3  0.17461908 0.05315369 126   3.285  0.0071

 X2 - X3  0.04653147 0.05315369 126   0.875  0.8176



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=0.00445754070255) change in total_time overall.

Significant (p=0.0208625322505) change in First vs. Second.
No Significant (p=0.909496384928) change in Second vs. Third.
No Significant (p=0.457248792484) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: total_time

  Effect           df                   MSE       F ges p.value

1  trial 2.43, 104.54 198383761228603808.00 5.14 ** .08    .004

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast  estimate       SE  df t.ratio p.value

 X0 - X1  250181136 85485579 129   2.927  0.0209

 X0 - X2  193204389 85485579 129   2.260  0.1130

 X0 - X3  319054756 85485579 129   3.732  0.0016

 X1 - X2  -56976746 85485579 129  -0.667  0.9095

 X1 - X3   68873620 85485579 129   0.806  0.8517

 X2 - X3  125850367 85485579 129   1.472  0.4572



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=3.92070911182e-05) change in total_distance overall.

Significant (p=0.0455551333817) change in First vs. Second.
No Significant (p=0.569779465464) change in Second vs. Third.
No Significant (p=0.541895031068) change in Third vs. Fourth.
Anova Table (Type 3 tests)



Response: total_distance

  Effect           df     MSE         F ges p.value

1  trial 2.35, 100.91 3630.42 10.09 *** .13  <.0001

---

Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1



Sphericity correction method: GG 

 contrast estimate       SE  df t.ratio p.value

 X0 - X1  29.98408 11.36132 129   2.639  0.0456

 X0 - X2  44.66116 11.36132 129   3.931  0.0008

 X0 - X3  59.83860 11.36132 129   5.267  <.0001

 X1 - X2  14.67708 11.36132 129   1.292  0.5698

 X1 - X3  29.85452 11.36132 129   2.628  0.0469

 X2 - X3  15.17744 11.36132 129   1.336  0.5419



P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=2.30700847261e-05) change in fd_space overall.

No Significant (p=0.167168101394) change in First vs. Second.
No Significant (p=0.664567582089) change in Second vs. Third.
No Significant (p=0.187894310451) change in Third vs. Fourth.
Anova Table (Type 3 tests)

Response: fd_space
  Effect           df  MSE        F ges p.value
1  trial 2.70, 116.09 0.00 9.55 *** .13  <.0001
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1

Sphericity correction method: GG 

 contrast   estimate         SE  df t.ratio p.value
 X0 - X1  0.02706996 0.01305229 129   2.074  0.1672
 X0 - X2  0.04196972 0.01305229 129   3.216  0.0088
 X0 - X3  0.06826739 0.01305229 129   5.230  <.0001
 X1 - X2  0.01489976 0.01305229 129   1.142  0.6646
 X1 - X3  0.04119743 0.01305229 129   3.156  0.0106
 X2 - X3  0.02629767 0.01305229 129   2.015  0.1879

P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________
__________________________________________________
Significant (p=2.52503595988e-06) change in lacunarity_space overall.

No Significant (p=0.0541436242909) change in First vs. Second.
No Significant (p=0.318267160688) change in Second vs. Third.
No Significant (p=0.577652607356) change in Third vs. Fourth.
Anova Table (Type 3 tests)

Response: lacunarity_space
  Effect           df  MSE         F ges p.value
1  trial 2.71, 116.59 0.08 11.60 *** .13  <.0001
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1

Sphericity correction method: GG 

 contrast   estimate         SE  df t.ratio p.value
 X0 - X1  0.15150877 0.05892092 129   2.571  0.0541
 X0 - X2  0.25277096 0.05892092 129   4.290  0.0002
 X0 - X3  0.32815647 0.05892092 129   5.569  <.0001
 X1 - X2  0.10126219 0.05892092 129   1.719  0.3183
 X1 - X3  0.17664770 0.05892092 129   2.998  0.0170
 X2 - X3  0.07538551 0.05892092 129   1.279  0.5777

P value adjustment: tukey method for comparing a family of 4 estimates 

__________________________________________________

Hierarchical Linear Modeling of Navigation vs. Test Variables

This section shows the statistical analysis of the various Study and Test variables and their associated predictive power.


In [21]:
import pandas as pd
import numpy as np  # needed below for np.transpose in the consistency checks

# Load all data
data_summary = pd.read_csv(cbe_intermediate_filename)
data_space = pd.read_csv(test_cogrecon_filename, skiprows=1)
data_nav = pd.read_csv(nav_intermediate_filename)
data_nav_context = pd.read_csv(nav_context_intermediate_filename)
data_missassignment_by_context = pd.read_csv(misassignment_intermediate_filename)

# Sort
for df in [data_summary, data_space, data_nav, data_nav_context, data_missassignment_by_context]:
    df.sort_values(['subID', 'trial'], inplace=True)

# Confirm subID and trial match across all data
assert all([a==b==c==d==e for a,b,c,d,e in zip(data_summary['subID'].values, 
                                               data_space['subID'].values, 
                                               data_nav['subID'].values, 
                                               np.transpose([name for name, group in data_nav_context.groupby(['subID', 'trial'])['subID']])[0], # Note: The reason this funny business is needed is because this file is broken down further by context - so we have to group out that column
                                               data_missassignment_by_context['subID'].values)]), 'subIDs do not match in intermediate files'
assert all([a==b==c==d==e for a,b,c,d,e in zip(data_summary['trial'].values, 
                                               data_space['trial'].values, 
                                               data_nav['trial'].values, 
                                               np.transpose([name for name, group in data_nav_context.groupby(['subID', 'trial'])['trial']])[1], # Note: The reason this funny business is needed is because this file is broken down further by context - so we have to group out that column
                                               data_missassignment_by_context['trial'].values)]), 'trials do not match in intermediate files'

data = pd.DataFrame()

# Random Factors
data['subID'] = data_space['subID']
data['trial'] = data_space['trial']

# Study Time Factors (independent variables)

# AS) Simple Path Factors
data['space_travelled'] = data_nav['total_distance']
data['time_travelled'] = data_nav['total_time']

# BS) Complex Path Factors
data['fd_space'] = data_nav['fd_space']
data['lacunarity_space'] = data_nav['lacunarity_space']

# Test Time Factors (dependent variables)

# AT) Simple Factors
data['space_misplacement'] = data_space['Original Misplacement']

# BT) Context Factors
data['across_context_boundary_effect'] = data_summary['context_crossing_dist_exclude_wrong_color_pairs']
data['within_context_boundary_effect'] = data_summary['context_noncrossing_dist_exclude_wrong_color_pairs']
data['context_boundary_effect'] = data_summary['context_crossing_dist_exclude_wrong_color_pairs'] - data_summary['context_noncrossing_dist_exclude_wrong_color_pairs']

# CT) Relational Memory Factors
data['accurate_misassignment_space'] = data_space['Accurate Misassignment']

# DT) Relational Memory and Context Factors
data['within_misassignments'] = data_missassignment_by_context['within_misassignments']
data['across_misassignments'] = data_missassignment_by_context['across_misassignments']

if 'generate_intermediate_files' in vars() and generate_intermediate_files:
    data.to_csv(full_dataset_filename)

Relationships of Interest

  • Do Simple or Complex Path Factors (AS, BS) predict Simple Factors (AT), Relational Memory Factors (CT), or Relational Memory and Context Factors (DT)?
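Before moving to R, here is a toy illustration (not part of the analysis pipeline) of the mixed-model structure we will fit below: a fixed effect of trial with subject-level random effects. For brevity it uses statsmodels and a random intercept only, whereas the R models below ('~ trial | subID') also include a random slope; all names and data here are made up.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate 10 subjects x 4 trials with subject-specific intercepts
rng = np.random.RandomState(0)
rows = []
for sub in range(10):
    subject_intercept = rng.normal(scale=2.0)
    for trial in range(4):
        rows.append({'subID': sub, 'trial': trial,
                     'y': subject_intercept + 0.5 * trial + rng.normal()})
toy = pd.DataFrame(rows)

# Fixed effect of trial, random intercept per subject
toy_fit = smf.mixedlm('y ~ trial', toy, groups=toy['subID']).fit()
print(toy_fit.params['trial'])  # fixed-effect slope, recovering roughly the simulated 0.5
```

The grouping argument plays the role of the `| subID` term in the nlme formulas used below.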

Import R Modules for HLM

Because Python lacks mature tooling for this kind of hierarchical linear modeling, we'll use the nlme package in R, with rpy2 as the interface.


In [22]:
from rpy2.robjects import pandas2ri
import rpy2.robjects as robjects
import numpy as np

pandas2ri.activate()

Just in case nlme is not properly installed, we can try to install it. If prompts appear, you'll need to accept the install in order to use the next code sections.


In [23]:
import rpy2.robjects.packages as rpackages
from rpy2.robjects.vectors import StrVector
from rpy2.robjects import r

utils = rpackages.importr('utils')
utils.chooseCRANmirror(ind=1)

# R package names
packnames = ['nlme', 'stats', 'reghelper', 'MuMIn', 'afex']

names_to_install = [x for x in packnames if not rpackages.isinstalled(x)]
if len(names_to_install) > 0:
    utils.install_packages(StrVector(names_to_install))

Now we can convert the data and import the nlme package.


In [24]:
r_dataframe = pandas2ri.py2ri(data)

nlme = rpackages.importr('nlme')
rstats = rpackages.importr('stats')
reghelper = rpackages.importr('reghelper')
mumin = rpackages.importr('MuMIn')
afex = rpackages.importr('afex')

Next we can run the model and print out the t-Table.


In [25]:
# correlation=nlme.corSymm(form=r.formula('~1 | subID/trial'))
random_effects_model = '~ trial | subID'
navigation = 'time_travelled + space_travelled + fd_space + lacunarity_space'
model_formulas = [
    # Misplacement vs. Navigation Models
    'space_misplacement ~ ' + navigation,
    
    # Misassignment vs. Navigation Models
    'accurate_misassignment_space ~ ' + navigation,
]

transform_functions = [lambda x: x, lambda x: x]
                       # np.sqrt, np.cbrt, np.log, lambda x: 1/x, lambda x: x
data_transformed = data.copy(True)
for c, tf in zip(['space_misplacement', 
              'accurate_misassignment_space',
              'within_misassignments', 'across_misassignments', 'context_boundary_effect'], transform_functions):
    data_transformed[c] = data_transformed[c].apply(tf)
r_transformed_dataframe = pandas2ri.py2ri(data_transformed)

models = [nlme.lme(r.formula(model_formula),
                   random=r.formula(random_effects_model), 
                   data=r_transformed_dataframe, 
                   control=nlme.lmeControl(maxIter=100, msMaxIter=100, opt='optim'), # Note: 'optim' is needed to avoid failure to converge
                   **{'na.action': 'na.omit'} # Other options can be found here: https://stat.ethz.ch/R-manual/R-devel/library/stats/html/na.fail.html
                  )
          for model_formula in model_formulas]

# Uncomment this to view all the possible keys
# print(r.summary(models[0]).names)

# We can pick certain keys to ignore during printing
ignore_keys = ['modelStruct', 'dims', 'contrasts', 'coefficients', 'varFix',
               'sigma', 'apVar', 'numIter', 'groups', 'logLik', 
               'call', 'terms', 'fitted', 'method', 'residuals', 
               'fixDF', 'na.action', 'data' , 'corFixed', # 'tTable'
               'BIC', 'AIC'
              ]

for model, name in zip(models, model_formulas):
    for key in r.summary(model).names:
        if key not in ignore_keys:
            print("_"*len(name))
            print(name)
            print("_"*len(name))
            print(r.summary(model).rx2(key))
            print(reghelper.beta(model).rx2('tTable'))


___________________________________________________________________________________
space_misplacement ~ time_travelled + space_travelled + fd_space + lacunarity_space
___________________________________________________________________________________
                         Value    Std.Error  DF    t-value      p-value
(Intercept)      -1.565284e+02 4.550451e+01 128 -3.4398438 0.0007862773
time_travelled   -6.019638e-10 7.207787e-10 128 -0.8351576 0.4051851342
space_travelled  -3.256128e-02 1.387510e-02 128 -2.3467425 0.0204733937
fd_space          9.864444e+01 2.620222e+01 128  3.7647361 0.0002529549
lacunarity_space  1.289953e+01 6.901561e+00 128  1.8690736 0.0638979077

                         Value  Std.Error  DF    t-value      p-value
(Intercept)        -0.31742022 0.08678691 128 -3.6574666 0.0003707672
time_travelled.z   -0.04875225 0.05837491 128 -0.8351576 0.4051851341
space_travelled.z  -0.54915126 0.23400575 128 -2.3467425 0.0204733937
fd_space.z          0.50670381 0.13459212 128  3.7647361 0.0002529549
lacunarity_space.z  0.36270875 0.19405804 128  1.8690736 0.0638979077

_____________________________________________________________________________________________
accurate_misassignment_space ~ time_travelled + space_travelled + fd_space + lacunarity_space
_____________________________________________________________________________________________
                         Value    Std.Error  DF    t-value      p-value
(Intercept)      -7.146167e+01 1.847584e+01 128 -3.8678446 0.0001738997
time_travelled   -1.960450e-10 2.973059e-10 128 -0.6594053 0.5108201807
space_travelled  -1.611610e-02 5.364471e-03 128 -3.0042299 0.0032038156
fd_space          3.461863e+01 1.043304e+01 128  3.3181734 0.0011800282
lacunarity_space  7.532572e+00 2.701993e+00 128  2.7877832 0.0061164749

                         Value  Std.Error  DF    t-value    p-value
(Intercept)        -0.01336484 0.10360857 128 -0.1289936 0.89756521
time_travelled.z   -0.04184056 0.06345726 128 -0.6593503 0.51085536
space_travelled.z  -0.71618729 0.23841778 128 -3.0039173 0.00320689
fd_space.z          0.46859508 0.14123000 128  3.3179570 0.00118087
lacunarity_space.z  0.55804426 0.20021461 128  2.7872304 0.00612631

We can see the overall fit of the various models in this next section.
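The marginal and conditional R² values that MuMIn's r.squaredGLMM reports follow the Nakagawa and Schielzeth variance decomposition. A minimal sketch with invented variance components (not values from the models above) shows how the two differ:

```python
# Illustrative only: these variance components are made-up numbers
var_fixed = 2.0    # variance explained by the fixed effects
var_random = 5.0   # variance explained by the random effects
var_resid = 3.0    # residual variance

total = var_fixed + var_random + var_resid
marginal = var_fixed / total                    # credits fixed effects only
conditional = (var_fixed + var_random) / total  # credits fixed + random effects

assert abs(marginal - 0.2) < 1e-12 and abs(conditional - 0.7) < 1e-12
```

The conditional R² is therefore always at least as large as the marginal R², as seen in the table below.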


In [26]:
r_squared_values = np.array([np.array(mumin.r_squaredGLMM_lme(model)) for model in models]).transpose()
values = [model_formulas, r_squared_values[0], r_squared_values[1]]
columns = ["Model Formulas", "Marginal R^2 (Fixed Effects)", "Conditional R^2 (Fixed + Random Effects)"]
r_squared_data_frame = pd.DataFrame({c: v for c, v in zip(columns, values)})
r_squared_data_frame


Out[26]:
Conditional R^2 (Fixed + Random Effects) Marginal R^2 (Fixed Effects) Model Formulas
0 0.726450 0.145448 space_misplacement ~ time_travelled + space_tr...
1 0.678052 0.158115 accurate_misassignment_space ~ time_travelled ...

We can also plot the random coefficients for subID and trial. First we'll extract the data then plot.


In [27]:
import matplotlib.pyplot as plt
%matplotlib inline

num_rows = int(np.ceil(len(models)/3.))
fig, axes = plt.subplots(num_rows, 3)
fig.set_size_inches((15, num_rows*5))
if num_rows > 1:
    axes = [item for sublist in axes for item in sublist]

for idx, (model, ax) in enumerate(zip(models, axes)):
    terms = r.summary(model).rx2('terms').r_repr()
    c = str(terms.split('~')[0].strip())

    title = 'model #{0} ({1})'.format(idx, terms)
    ax.set_title(title[:30] + '...')
    ax.set_ylabel(c)
    ax.set_xlabel('trial')
    
    #yd = np.transpose([np.array(pandas.Series(data[c][data['trial'] == n]).fillna(method='backfill')) for n in [0, 1, 2, 3]])
    yd = np.transpose([data_transformed[c][data_transformed['trial'] == n] for n in [0, 1, 2, 3]])
    # Get the subject coefficients
    subject_lines = pandas2ri.ri2py(r.summary(model).rx2('coefficients').rx2('random').rx2('subID'))
    x = np.array([0, 1, 2, 3])
    ys = [x*slope + intercept for intercept, slope in subject_lines]
    
    # Get the mean line from the t table
    # Note: This section is commented out because I didn't properly look for the variable of interest, 
    # so the index may change depending on the model
    #
    # mean_line = pandas2ri.ri2py(r.summary(model).rx2('tTable'))
    # mean = x*mean_line[1][0] + mean_line[1][1]
    # plt.plot(list(x), list(mean), c='r', marker='>')
    
    # Plot all the lines
    for y, label, y2 in zip(ys, list(np.linspace(0.0, 1.0, len(ys))), yd):
        ax.plot(list(x), list(y), c='b', marker='>', alpha=0.25)
        ax.plot(list(x), y2 - np.nanmean(yd, axis=0), c='g', alpha=0.1)

    

# Clean up empty extra subplots (skip when the last row is full)
for idx in range(1, (3 - len(models) % 3) % 3 + 1):
    axes[-idx].axis('off')
    
plt.show()


We can visualize the correlations as well.


In [28]:
import matplotlib.pyplot as plt

num_rows = int(np.ceil(len(models)/3.))
fig, axes = plt.subplots(num_rows, 3)
fig.set_size_inches((15, num_rows*5))
if num_rows > 1:
    axes = [item for sublist in axes for item in sublist]

print_buffer = ''

for model, name, ax in zip(models, model_formulas, axes):
    print_buffer += str("_"*len(name)) +'\r\n'
    print_buffer += str(name)+'\r\n'
    print_buffer += str("_"*len(name))+'\r\n'
    print_buffer += str(r.summary(model).rx2('corFixed'))+'\r\n'
    correlations = pandas2ri.ri2py(r.summary(model).rx2('corFixed'))
    ax.imshow(correlations)

# Clean up empty extra subplots (skip when the last row is full)
for idx in range(1, (3 - len(models) % 3) % 3 + 1):
    axes[-idx].axis('off')

plt.show()

print(print_buffer)


___________________________________________________________________________________
space_misplacement ~ time_travelled + space_travelled + fd_space + lacunarity_space
___________________________________________________________________________________
                 (Intercept) time_travelled space_travelled    fd_space
(Intercept)        1.0000000    -0.29439039       0.9580580 -0.58064506
time_travelled    -0.2943904     1.00000000      -0.1602716  0.38271778
space_travelled    0.9580580    -0.16027160       1.0000000 -0.51268465
fd_space          -0.5806451     0.38271778      -0.5126847  1.00000000
lacunarity_space  -0.8378927     0.08039419      -0.8436997  0.04376613

                 lacunarity_space
(Intercept)           -0.83789268
time_travelled         0.08039419
space_travelled       -0.84369972
fd_space               0.04376613
lacunarity_space       1.00000000

_____________________________________________________________________________________________
accurate_misassignment_space ~ time_travelled + space_travelled + fd_space + lacunarity_space
_____________________________________________________________________________________________
                 (Intercept) time_travelled space_travelled   fd_space
(Intercept)        1.0000000     -0.2930915       0.9592377 -0.6188087
time_travelled    -0.2930915      1.0000000      -0.1670133  0.3552206
space_travelled    0.9592377     -0.1670133       1.0000000 -0.5566997
fd_space          -0.6188087      0.3552206      -0.5566997  1.0000000
lacunarity_space  -0.8442151      0.1069675      -0.8476375  0.1032831

                 lacunarity_space
(Intercept)            -0.8442151
time_travelled          0.1069675
space_travelled        -0.8476375
fd_space                0.1032831
lacunarity_space        1.0000000


And we can compare multiple models of interest. For instance, if we create a larger model that combines all simple and complex navigation variables, we can see if the fit is better than if we use restricted models.
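For reference, a sketch of the likelihood-ratio comparison that r.anova applies to nested lme fits, using the log-likelihoods and parameter counts it reports (the numbers below are taken from the ANOVA summary this cell prints):

```python
import scipy.stats as stats

# Log-likelihoods and parameter counts as reported in the ANOVA summary
loglik_restricted, df_restricted = -593.254231, 6
loglik_full, df_full = -595.427062, 9

# Likelihood-ratio statistic: twice the absolute log-likelihood difference,
# referred to a chi-squared distribution with df equal to the extra parameters
chisq = 2 * abs(loglik_full - loglik_restricted)
p = stats.chi2.sf(chisq, df_full - df_restricted)
# These reproduce the Chisq and P Value columns of the summary below
```

A non-significant p here (as in this comparison) means the extra predictors do not improve the fit enough to justify their added parameters.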


In [29]:
# Note: One interesting example of model comparisons is comparing the spatial and temporal path metrics predictive power
# over spatial or temporal misplacement. i.e.:
#
# model_0 = 'space_misplacement ~ (space_travelled + fd_space + lacunarity_space)'
# model_1 = 'space_misplacement ~ (time_travelled + fd_time + lacunarity_time)'
# and
# model_0 = 'time_misplacement ~ (space_travelled + fd_space + lacunarity_space)'
# model_1 = 'time_misplacement ~ (time_travelled + fd_time + lacunarity_time)'
#
# In both cases, the temporal path metrics resulted in a better fit than the spatial ones, 
# even when the test metric was spatial.

# Define two models to compare
model_0 = 'space_misplacement ~ space_travelled'
model_1 = model_formulas[0]

# Make a list of formulas to pass to the model function
step_mf = [model_0, model_1]

step_models = [nlme.lme(r.formula(model_formula),
                   random=r.formula(random_effects_model), 
                   data=r_transformed_dataframe, 
                   control=nlme.lmeControl(maxIter=100, msMaxIter=100, opt='optim'), # Note: 'optim' is needed to avoid failure to converge
                   **{'na.action': 'na.omit'} # Other options can be found here: https://stat.ethz.ch/R-manual/R-devel/library/stats/html/na.fail.html
                  )
          for model_formula in step_mf]

# Get the model results and run an anova to compare the models
m0, m1 = step_models
result = r.anova(m0, m1)

# Convert the output of the anova into a Pandas DataFrame for convenience
summary_frame = pd.DataFrame(np.transpose(np.concatenate([[[step_mf[0], step_mf[1]]], np.array(result[2:])])))
for idx, c in enumerate(summary_frame): # Coerce all but the first column to numeric
    if idx != 0: summary_frame[c] = pd.to_numeric(summary_frame[c], errors='coerce')
    
# Name the columns for convenience
summary_frame.columns = ['Model Formula', 'Df', 'AIC', 'BIC', 'logLik', 'Model', 'Chisq', 'P Value'][0:len(summary_frame.columns)]

print("_"*50 + "\r\nANOVA Summary")
print(summary_frame)

print("_"*50 + "\r\nResult\r\n" + "_"*50)

# If the P Value is significant, check the BIC to determine which model was a better fit (lower BIC is better fit)
best_model_idx = -1
if len(summary_frame.columns) == 5:
    print('The models are identical.')
elif summary_frame['P Value'][1] < 0.05:
    print('The models were significantly different.')
    if summary_frame['BIC'][0] > summary_frame['BIC'][1]:
        print("The second model ({0}) is significantly better than the first.".format(step_mf[1]))
        best_model_idx = 1
    else:
        print("The first model ({0}) is significantly better than the second.".format(step_mf[0]))
        best_model_idx = 0
else:
    print('The models were NOT significantly different.')

print("_"*50 + "\r\nBest Model t Table(s)\r\n" + "_"*50)

if best_model_idx == -1:
    print(r.summary(m0).rx2('tTable'))
    print(r.summary(m1).rx2('tTable'))
    print("_"*50 + "\r\nBest Model Beta Table(s)\r\n" + "_"*50)
    print(reghelper.beta(m0).rx2('tTable'))
    print(reghelper.beta(m1).rx2('tTable'))
    best_model = step_models[0]
else:
    print(r.summary(step_models[best_model_idx]).rx2('tTable'))
    print("_"*50 + "\r\nBest Model Beta Table(s)\r\n" + "_"*50)
    print(reghelper.beta(step_models[best_model_idx]).rx2('tTable'))
    best_model = step_models[best_model_idx]


__________________________________________________
ANOVA Summary
                                       Model Formula   Df          AIC  \
0               space_misplacement ~ space_travelled  6.0  1198.508462   
1  space_misplacement ~ time_travelled + space_tr...  9.0  1208.854123   

           BIC      logLik  Model     Chisq   P Value  
0  1217.462794 -593.254231    1.0       NaN       NaN  
1  1237.129095 -595.427062    2.0  4.345661  0.226477  
__________________________________________________
Result
__________________________________________________
The models were NOT significantly different.
__________________________________________________
Best Model t Table(s)
__________________________________________________
                     Value   Std.Error  DF  t-value      p-value
(Intercept)     3.89654349 1.509100672 131 2.582030 1.092111e-02
space_travelled 0.01566543 0.003325646 131 4.710494 6.214527e-06

                         Value    Std.Error  DF    t-value      p-value
(Intercept)      -1.565284e+02 4.550451e+01 128 -3.4398438 0.0007862773
time_travelled   -6.019638e-10 7.207787e-10 128 -0.8351576 0.4051851342
space_travelled  -3.256128e-02 1.387510e-02 128 -2.3467425 0.0204733937
fd_space          9.864444e+01 2.620222e+01 128  3.7647361 0.0002529549
lacunarity_space  1.289953e+01 6.901561e+00 128  1.8690736 0.0638979077

__________________________________________________
Best Model Beta Table(s)
__________________________________________________
                       Value  Std.Error  DF   t-value      p-value
(Intercept)       -0.3826419 0.09583351 131 -3.992778 1.082195e-04
space_travelled.z  0.2642001 0.05608756 131  4.710494 6.214527e-06

                         Value  Std.Error  DF    t-value      p-value
(Intercept)        -0.31742022 0.08678691 128 -3.6574666 0.0003707672
time_travelled.z   -0.04875225 0.05837491 128 -0.8351576 0.4051851341
space_travelled.z  -0.54915126 0.23400575 128 -2.3467425 0.0204733937
fd_space.z          0.50670381 0.13459212 128  3.7647361 0.0002529549
lacunarity_space.z  0.36270875 0.19405804 128  1.8690736 0.0638979077

Note that the beta coefficients can be used to determine the relative importance of the various predictor variables in the chosen model. The larger the absolute value of the beta coefficient (i.e., the first column in the Beta Tables), the more strongly that independent variable predicts the dependent variable.
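To make the standardization concrete, here is a small self-contained sketch (with simulated data, not the task data) showing that a standardized beta is simply the raw slope rescaled by the predictor-to-outcome standard deviation ratio, which is what makes betas comparable across predictors with different units:

```python
import numpy as np

rng = np.random.RandomState(0)
x = rng.normal(size=200)
y = 2.5 * x + rng.normal(size=200)

# Raw least-squares slope
b = np.polyfit(x, y, 1)[0]

# Standardized (beta) slope: refit on z-scored variables
zx = (x - x.mean()) / x.std()
zy = (y - y.mean()) / y.std()
beta = np.polyfit(zx, zy, 1)[0]

# The two are related by beta = b * sd(x) / sd(y)
assert np.isclose(beta, b * x.std() / y.std())
```

For a single predictor the beta equals the Pearson correlation; with multiple predictors it is the partial, standardized slope reported in the Beta Tables above.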

Finally, we can test for normality of our residuals to confirm that the model is appropriately capturing the effect.


In [30]:
%matplotlib inline
import matplotlib.pyplot as plt
import scipy.stats as stats

def residual_plot_1d(model, level=0, label=''):
    f, (ax0, ax1) = plt.subplots(1, 2)
    f.set_size_inches(15., 5.)
    
    data = np.array(r.residuals(model, level=level))

    n, bins, patches = ax0.hist(data, normed=1)
    (mu, sigma) = stats.norm.fit(data)
    y = stats.norm.pdf(bins, mu, sigma)  # overlay the fitted normal density
    l = ax0.plot(bins, y, 'r--', linewidth=1)

    # Q-Q plot against the normal distribution (the hypothesis under test)
    res = stats.probplot(data, dist='norm', plot=ax1)
    
    ax0.set_xlabel(label)
    ax0.set_ylabel('Probability')
    ax0.set_title(r'$\mathrm{Histogram\ of\ residuals:}\ \mu=%.3f,\ \sigma=%.3f$' % (mu, sigma))
    ax0.grid(True)
    ax1.grid(True)
    
    return data

mfs = model_formulas # [model_formulas[x] for x in [0, 1, 3, 5, 6]]

norm_models = [nlme.lme(r.formula(model_formula),
                   random=r.formula(random_effects_model), 
                   data=r_transformed_dataframe, 
                   control=nlme.lmeControl(maxIter=100, msMaxIter=100, opt='optim'), # Note: 'optim' is needed to avoid failure to converge
                   **{'na.action': 'na.omit'} # Other options can be found here: https://stat.ethz.ch/R-manual/R-devel/library/stats/html/na.fail.html
                  )
          for model_formula in mfs]

for model, form in zip(norm_models, mfs):
    model_under_test = model
    print(form)
    res0 = residual_plot_1d(model_under_test, level=0)
    res1 = residual_plot_1d(model_under_test, level=1)

    k2, p = stats.normaltest(res0)
    print('k2 = {0}, p = {1}'.format(k2, p))
    if p < 0.05:
        print('The residuals were NOT normal.')
    else:
        print('The residuals were normal.')
    k2, p = stats.normaltest(res1)
    print('k2 = {0}, p = {1}'.format(k2, p))
    if p < 0.05:
        print('The residuals were NOT normal.')
    else:
        print('The residuals were normal.')

    plt.show()


space_misplacement ~ time_travelled + space_travelled + fd_space + lacunarity_space
k2 = 25.1274679077, p = 3.49654942669e-06
The residuals were NOT normal.
k2 = 5.96586014747, p = 0.0506442249404
The residuals were normal.
accurate_misassignment_space ~ time_travelled + space_travelled + fd_space + lacunarity_space
k2 = 9.67357555328, p = 0.00793249411303
The residuals were NOT normal.
k2 = 0.883840230905, p = 0.64280098327
The residuals were normal.

Animate trial-over-trial relationship between two variables

We can animate the relationship between two variables using the following code. Note that the next code block will quickly generate the plot object, but we visualize it in the following cell (which can take some time to run).
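The tweening inside the animation callback is plain linear interpolation between each point's positions on consecutive trials; a minimal sketch of that step with made-up coordinates:

```python
import numpy as np

p0 = np.array([[0.0, 0.0], [1.0, 2.0]])  # scatter positions at trial n
p1 = np.array([[2.0, 2.0], [3.0, 0.0]])  # scatter positions at trial n + 1
nt = 0.5                                 # fractional progress between frames

# Each point moves along the straight line between its two trial positions
interp = p0 * (1 - nt) + p1 * nt
assert np.allclose(interp, [[1.0, 1.0], [2.0, 1.0]])
```

The same interpolation is applied to the regression line endpoints and to the r and p annotations in the code below.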


In [31]:
%%capture

import matplotlib.pyplot as plt
from matplotlib import animation, rc
from IPython.display import HTML
import scipy.stats as stats
import os
import IPython.display as display


import statsmodels.api as smapi
from statsmodels.formula.api import ols
import statsmodels.graphics as smgraphics

rc('animation', html='html5')
%matplotlib inline

relations = [
    ['space_misplacement', 'space_travelled'],
    ['space_misplacement', 'fd_space'],
    ['accurate_misassignment_space', 'space_travelled'],
    ['accurate_misassignment_space', 'fd_space'],
    ['accurate_misassignment_space', 'lacunarity_space']
]

def get_animation_from_data(xs, ys, title='', x_label='', y_label='', animate=True):
    def update_plot(t):
        if t+1 >= len(xs): # For pausing at the end
            t = 2.99
        index = int(np.floor(t))
        nt = t % 1.0
        
        newPoints = np.transpose([xs[index+1], ys[index+1]])
        originalPoints = np.transpose([xs[index], ys[index]])
        interpolation = originalPoints*(1-nt) + newPoints*nt
        
        scat.set_offsets(interpolation)
        
        line_x = np.linspace(np.min(xs), np.max(xs), 10)
        newLine = np.transpose([line_x, np.array([fits[index+1][0]*x + fits[index+1][1] for x in line_x])])
        originalLine = np.transpose([line_x, np.array([fits[index][0]*x + fits[index][1] for x in line_x])])
        interpolation = originalLine*(1-nt) + newLine*nt
        
        line_plot[0].set_xdata(np.transpose(interpolation)[0])
        line_plot[0].set_ydata(np.transpose(interpolation)[1])
        
        r = fits[index][2]*(1-nt) + fits[index+1][2]*nt
        p = fits[index][3]*(1-nt) + fits[index+1][3]*nt
        
        txt.set_text('p=' + "{:.2f}".format(p) + '\n' + 'r=' + "{:.2f}".format(r))
        
        return scat,
    
    if animate:
        fig = plt.figure()
        ax = plt.gca()
    else:
        fig = plt.figure()
        ax1 = fig.add_subplot(221)
        ax2 = fig.add_subplot(222)
        ax3 = fig.add_subplot(223)
        ax4 = fig.add_subplot(224)
        axs = [ax1, ax2, ax3, ax4]
        ax = axs[0]
        for ax in axs:
            ax.set_xlabel(x_label)
            ax.set_ylabel(y_label)
    
    masks = [[ ~np.isnan(varx) & ~np.isnan(vary) for varx, vary in zip(x, y)] for x, y in zip(xs, ys)]
    
    for idx, (x, y, mask) in enumerate(zip(xs, ys, masks)):
        xv, yv = np.asarray(x)[np.asarray(mask)], np.asarray(y)[np.asarray(mask)]
        regression = ols("data ~ x", data=dict(data=yv, x=xv)).fit()
        outliers = []
        try:
            test = regression.outlier_test()
            outliers = [(i, xv[i], yv[i]) for i, t in enumerate(np.asarray(test['bonf(p)'])) if t < 0.5]
        except Exception:
            pass
        print('Outliers: {0}'.format(outliers))
        # Exclude the outliers from the line fit by masking them out rather than deleting rows
        masked_positions = np.flatnonzero(mask)
        for i, _, _ in outliers:
            masks[idx][masked_positions[i]] = False

    
    fits = [stats.linregress(x[mask], y[mask]) for x, y, mask in zip(xs, ys, masks)]
    
    scat = plt.scatter([], [], color='white', edgecolors ='black')
    
    if animate:
        txt = plt.text(1, 0, '', fontsize=12, ha='right', va='bottom', transform=ax.transAxes)
        line_plot = plt.plot([], [], color='black')
        anim = animation.FuncAnimation(fig, update_plot, frames=np.arange(0, 4, 0.01), interval=33)
        plt.title(title)
        [plt.scatter(x, y, label=idx+1, alpha=0.5) for idx, (x, y) in enumerate(zip(xs, ys))]
        plt.legend(loc=2)
        plt.tight_layout()
    else:
        plt.suptitle(title)
        fig.set_size_inches((15, 8))
        for idx, (x, y, ax, fit) in enumerate(zip(xs, ys, axs, fits)):
            ax.set_title('Trial {0}'.format(idx+1))
            txt = plt.text(1, 0, '', fontsize=12, ha='right', va='bottom', transform=ax.transAxes)
            ax.scatter(x, y, label=idx+1, alpha=0.5)
            line_plot = plt.plot([], [], color='black')
            line_x = np.linspace(np.min(x), np.max(x), 10)
            line_y = np.array([fit[0]*x + fit[1] for x in line_x])
            ax.plot(line_x, line_y, 'k')
            r = fit[2]
            p = fit[3]
            txt.set_text('p=' + "{:.2f}".format(p) + '\n' + 'r=' + "{:.2f}".format(r))
            plt.tight_layout()
    
    if animate:
        plt.close()
        return anim

def get_animation(relation, animate=True):
    xs = [data[relation[1]][data['trial'] == n] for n in [0, 1, 2, 3]]
    ys = [data[relation[0]][data['trial'] == n] for n in [0, 1, 2, 3]]

    return get_animation_from_data(xs, ys, title=relation[1].replace('_',' ').title() + ' vs. ' + relation[0].replace('_',' ').title(), x_label=relation[1].replace('_',' ').title(), y_label=relation[0].replace('_',' ').title(), animate=animate)

# Note: you'll need to change the ffmpeg path to use this on your machine
def embed_animation_as_gif(_animation, ffmpeg_path=r'C:\mingw2\bin\ffmpeg.exe'):
    import base64
    if not os.path.exists(ffmpeg_path):
        return _animation
    _animation.save('animation.mp4')
    os.system("ffmpeg -i animation.mp4 animation.gif -y")
    IMG_TAG = '<img src="data:image/gif;base64,{0}">'
    with open('animation.gif', 'rb') as f:
        # base64-encode the GIF so it can be embedded inline in the notebook HTML
        data = base64.b64encode(f.read()).decode('ascii')
    return HTML(IMG_TAG.format(data))

In [32]:
for relation in relations:
    display_obj = get_animation(relation, animate=False)
    if 'generate_embedded_animations' in vars() and generate_embedded_animations and 'embed_animation_as_gif' in vars():
        display_obj = embed_animation_as_gif(display_obj)
    display.display_html(display_obj)


Outliers:  []
(repeated for each of the four trials in each relation; no outliers were found)

Plotting the Residuals of the Fit

Next, we can plot the same effect in terms of the residuals after accounting for the random effects.


In [33]:
import matplotlib.pyplot as plt

relations = [
    ['space_misplacement', 'space_travelled'],
    ['space_misplacement', 'fd_space'],
    ['accurate_misassignment_space', 'space_travelled'],
    ['accurate_misassignment_space', 'fd_space'],
    ['accurate_misassignment_space', 'lacunarity_space']
]

relation_models = [models[x] for x in [0, 0, 1, 1, 1]]

def plot_residuals(r0, r1, model, title=''):
    # Make new figure
    plt.figure()
    # Get the subject coefficients
    subject_lines = pandas2ri.ri2py(r.summary(model).rx2('coefficients').rx2('random').rx2('subID'))
    x = np.array([0, 1, 2, 3])
    ys = [x*slope + intercept for intercept, slope in subject_lines]
    
    # Get the relation data
    r0_data = [data[r0][data['trial'] == n] for n in list(x)]
    r1_data = [data[r1][data['trial'] == n] for n in list(x)]
    
    # Compute the residuals of the target variable
    r0_resid = np.transpose(np.array(ys)) - np.array(r0_data)
    
    return get_animation_from_data(r0_resid, r1_data, title=title)

In [ ]:
for relation, relation_model in zip(relations, relation_models):
    r0, r1 = relation
    display_obj = plot_residuals(r0, r1, relation_model, title=r0 + ' vs. ' + r1)
    if 'generate_embedded_animations' in vars() and generate_embedded_animations and 'embed_animation_as_gif' in vars():
        display_obj = embed_animation_as_gif(display_obj)
    display.display_html(display_obj)

Visualization

Next, we'll do some basic visualization of the navigation data.


In [35]:
from cogrecon.core.data_flexing.time_travel_task.time_travel_task_binary_reader import read_binary_file
import os
import numpy as np
from math import atan2, degrees

def GetAngleOfLineBetweenTwoPoints(p1, p2):
    """Return the heading from p1 to p2 in degrees, measured
    counter-clockwise from the positive x-axis (range (-180, 180])."""
    xDiff = p2['x'] - p1['x']
    yDiff = p2['y'] - p1['y']
    return degrees(atan2(yDiff, xDiff))
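As a quick sanity check of this heading helper, a self-contained copy (here called angle_between, a stand-in for GetAngleOfLineBetweenTwoPoints) behaves as follows:

```python
from math import atan2, degrees

# Stand-in for GetAngleOfLineBetweenTwoPoints: heading from p1 to p2 in
# degrees, counter-clockwise from the positive x-axis.
def angle_between(p1, p2):
    return degrees(atan2(p2['y'] - p1['y'], p2['x'] - p1['x']))

print(angle_between({'x': 0, 'y': 0}, {'x': 1, 'y': 1}))   # 45.0
print(angle_between({'x': 0, 'y': 0}, {'x': -1, 'y': 0}))  # 180.0
```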

In [36]:
data = pd.read_csv(os.path.join('.', output_directory, 'study_path.csv'))
grp = data.groupby(['subject_id', 'trial_number'])

In [37]:
target = (1, 0)
for name, group in grp:
    if name == target:
        x, y = np.transpose(np.array(group[['x', 'z']]))
        break
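An equivalent, loop-free way to pull out a single (subject_id, trial_number) group is DataFrameGroupBy.get_group. A minimal sketch with synthetic data (the real study_path.csv is not reproduced here):

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for the study_path.csv contents
df = pd.DataFrame({
    'subject_id':   [1, 1, 2],
    'trial_number': [0, 0, 0],
    'x':            [0.0, 1.0, 2.0],
    'z':            [0.0, 0.5, 1.0],
})

# get_group fetches the rows where (subject_id, trial_number) == (1, 0) directly
group = df.groupby(['subject_id', 'trial_number']).get_group((1, 0))
x, y = np.transpose(np.array(group[['x', 'z']]))
```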

In [38]:
import matplotlib.pyplot as plt
from matplotlib.patches import Wedge, Rectangle, Circle

fov = 60.0
idx = int(len(x)/1.2)
window = 2.
study_realX = [5, 10, 10, 50, 10, 65, 35, 15, 35, 55, -5, -5, 20, 60, 60, 60]
study_realY = [15, 5, 30, 10, 55, 15, 15, 65, 65, 65, 25, 65, 45, 60, 30, 45]
space_points = list(zip(study_realX, study_realY))
e = 'k'

plt.figure(figsize=(10, 10))
plt.xlabel('Space X')
plt.ylabel('Space Y')
angle = GetAngleOfLineBetweenTwoPoints({'x':x[idx], 'y': y[idx]}, {'x': x[idx+10], 'y': y[idx+10]})
plt.gca().add_artist(Rectangle((-10,0),80,80, color='k', alpha=0.4, fill=False))
plt.gca().add_artist(Rectangle((-10,0),40,80, color='k', alpha=0.4, fill=False))
plt.gca().add_artist(Rectangle((-10,0),80,40, color='k', alpha=0.4, fill=False))
# Wedge takes absolute start/end angles, so a fov-degree field of view spans fov/2 on either side of the heading
plt.gca().add_artist(Wedge((x[idx], y[idx]), 10, angle - fov / 2, angle + fov / 2, color='r', alpha=0.4))
plt.gca().set_aspect('equal')
for i, sp in enumerate(space_points):
    if i == len(space_points) - 1:
        plt.gca().add_artist(Circle((sp[0], sp[1]), 10, fill=False, alpha=1, edgecolor='k'))
    plt.gca().add_artist(Circle((sp[0], sp[1]), 10, color='g', alpha=0.1, edgecolor=e))
plt.scatter(np.transpose(space_points)[0], np.transpose(space_points)[1], c='g')
plt.scatter([x[idx]], [y[idx]], c='k', alpha=1, zorder=10)
plt.plot(x, y)

plt.show()


C:\Program Files\Anaconda3\envs\iposition\lib\site-packages\matplotlib\patches.py:91: UserWarning: Setting the 'color' property will overridethe edgecolor or facecolor properties. 
  warnings.warn("Setting the 'color' property will override"
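The warning arises because the Circle patches are given both color and edgecolor, and color sets the face and edge at once. Passing facecolor and edgecolor separately avoids it; a minimal sketch:

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend so this runs headless
import matplotlib.pyplot as plt
from matplotlib.patches import Circle

fig, ax = plt.subplots()
# facecolor/edgecolor instead of color, so neither setting overrides the other
c = Circle((0, 0), 10, facecolor='g', alpha=0.1, edgecolor='k')
ax.add_artist(c)
```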

In [ ]: